
Sunday, August 28, 2011

Confused

What is confusion?





According to Collins English Dictionary, the main meaning of the word "confused" is:

confused [kənˈfjuːzd] adj
1. feeling or exhibiting an inability to understand; bewildered; perplexed
That sounds about right. But hang on. Isn't there something odd about this: "feeling or exhibiting an inability to understand..."?



Those are two completely different things. Sometimes people exhibit a lack of understanding and don't feel it - they think they understand, but actually they don't. Indeed, that's the worst kind of confusion, because it leads to people making mistakes based on wrong assumptions. Whereas feeling confused is much less of a problem. If you know you're confused, you won't go around acting as if you're not.



The feeling of confusion happens when you've just avoided being confused, or just come out of it. Confusion is a feeling, and also a state, and the two are not just separate but (to some extent) mutually exclusive. If you feel confused, you can't actually be seriously confused.



Yet we use the same word for both, and the dictionary treats them both as being not just the same but part of the same definition. Confusing.



Or take being drunk. "Drunk" is a feeling, certainly. It's also a state, and they only sometimes go together. You can be drunker than you feel, with hilarious or tragic consequences. Everyone knows that you can't trust a drunken person to know how drunk they are.



Consider "depression". Depression is a feeling. No question about that. We've all felt at least a little depressed. Depression is also a state, that certain people go into as a result of mental illness, physical illness or as a side effect of certain drugs.



But the state of depression is no more equivalent to the feeling of depression than being confused means feeling confused. In my experience of depression, feeling depressed is a sign that I'm only slightly depressed. When I'm really depressed, I don't think I'm depressed at all.



This is one of the most insidious things about depression: it 'creeps up on you'. Over a period of time - usually several days, in my case, but it can be much longer or shorter - your mind changes.



You stop noticing opportunities, and become obsessed with risks.
Your ability to take decisions and come up with ideas withers and your imagination fails you. Your thoughts get stuck in loops. You feel weary doing the things you used to enjoy and angry around people you used to like.



In other words, your mind changes. Your memory, thinking and perceptions are all altered - but you don't notice that. You notice the effects, of course, but you think they're outside: you think the world has suddenly become less friendly. A classic case of confusion, in the worst sense.

Tuesday, August 2, 2011

The 30something Brain

Brain maturation continues for longer than previously thought - well up until age 30. That's according to two papers just out, which may be comforting for those lamenting the fact that they're nearing the big Three Oh.

This challenges the widespread view that maturation is essentially complete by the end of adolescence, in the early to mid 20s.

Petanjek et al show that the number of dendritic spines in the prefrontal cortex increases during childhood and then rapidly falls during puberty - which probably represents a kind of "pruning" process. That's nothing new, but they also found that the pruning doesn't stop when you hit 20. It continues, albeit gradually, up to 30 and beyond.

This study looked at post-mortem brain samples taken from people who died at various ages. Lebel and Beaulieu used diffusion MRI to examine healthy living brains. They scanned 103 people, and everyone got at least 2 scans a few years apart, so they could look at changes over time.

They found that the fractional anisotropy (a measure of the "integrity") of different white matter tracts varies with age in a non-linear fashion. All tracts become stronger during childhood, and most peak at about 20 and then start to weaken again. But not all of them: some, such as the cingulum, take longer to mature.
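
To make "non-linear" concrete, here's a minimal sketch of what finding a peak in that kind of data looks like. The numbers are invented for illustration, and I've used a simple quadratic rather than whatever model the authors actually fitted:

    # Minimal sketch: fit a quadratic to hypothetical FA-vs-age data and find the peak.
    # The data points are invented for illustration; they are not from the paper.
    import numpy as np

    ages = np.array([8, 10, 12, 15, 18, 20, 22, 25, 28, 31], dtype=float)
    fa   = np.array([0.42, 0.45, 0.47, 0.50, 0.52, 0.53, 0.53, 0.52, 0.51, 0.50])

    # Quadratic fit: fa ~ a*age**2 + b*age + c
    a, b, c = np.polyfit(ages, fa, deg=2)

    # For a downward-opening parabola (a < 0), FA peaks where the slope is zero.
    peak_age = -b / (2 * a)
    print(f"Estimated peak age: {peak_age:.1f} years")

A late-maturing tract like the cingulum would simply give a later peak age; the point is that "maturation" here is a curve, not a switch.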

Also, total white matter volume continues rising well up to age 30.

Plus, there's a lot of individual variability. Some people's brains were still maturing well into their late 20s, even in white matter tracts that on average are mature by 20. Some of this will be noise in the data, but not all of it.

These results also fit nicely with this paper from last year that looked at functional connectivity of brain activity.

So, while most maturation does happen before and during adolescence, these results show that it's not a straightforward case of The Adolescent Brain turning suddenly into The Adult Brain when you hit 21, at which point it solidifies into the final product.

Lebel C, & Beaulieu C (2011). Longitudinal development of human brain wiring continues from childhood into adulthood. The Journal of Neuroscience, 31(30), 10937-47. PMID: 21795544

Petanjek Z, Judas M, Simic G, Rasin M, Uylings H, Rakic P, & Kostovic I (2011). Extraordinary neoteny of synaptic spines in the human prefrontal cortex. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1105108108

Saturday, May 14, 2011

Filters

At TED, Eli Pariser, author of The Filter Bubble, talks about how:
As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our world-view. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.
His point is that the web is, technologically, a fantastic system for giving the consumers of information (i.e. you) exactly what they want, when they want it. It's enabled a degree of personalization which old media could never come close to. But this isn't necessarily a good thing, because people tend to pick and choose information that fits with their existing views and interests, and filter out everything else.
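
To see how that feedback loop plays out, here's a toy sketch - my own illustration, not Pariser's example or any real site's algorithm - of a click-driven personalization filter: every click on a topic makes that topic more likely to be shown next time, and the other topics quietly fade away.

    # Toy "filter bubble": a click-driven recommender that narrows over time.
    # Purely illustrative - not based on any real site's algorithm.
    import random

    topics = ["politics", "science", "sport", "arts"]
    weights = {t: 1.0 for t in topics}   # start out unpersonalized

    def recommend():
        """Show a story, with topics chosen in proportion to their weights."""
        return random.choices(topics, weights=[weights[t] for t in topics])[0]

    def click(topic):
        """The reader clicks; the filter learns to show more of the same."""
        weights[topic] *= 1.5

    random.seed(1)
    for _ in range(50):                  # fifty visits from a politics-clicking reader
        shown = recommend()
        if shown == "politics":
            click(shown)

    share = weights["politics"] / sum(weights.values())
    print(f"Politics now makes up about {share:.0%} of what gets recommended")

Run it a few times: a handful of early clicks is usually enough for one topic to crowd out the rest, which is exactly the narrowing Pariser is worried about.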

The problem is not entirely new. Back in the days when everyone read their daily newspaper, the newspaper editor was your filter. And because there were maybe a dozen newspapers in your region that you could buy, you'd choose the one that best fitted with your world-view.


Indeed, in the UK, what newspaper you read says considerably more about you than what party you vote for. There are only 3 main political parties, but there are about 10 main newspapers, and in my experience people are more likely to change their vote than to change what they read.

But the internet allows people to cherry-pick far more effectively. The Guardian, for example, regularly prints articles that annoy, or at least challenge, many Guardian readers. That's inevitable, because no two people have exactly the same tastes: what one reader loves will have another reader tearing up his paper in frustration.

Nowadays, it's quite possible to get all of your news and views from blogs. Blogs are specialized: they cover a particular kind of story, with a particular slant. Many of them do that extremely well. If you don't quite agree with a given blog, there are plenty of others with a slightly different approach to pick from. And you can pick as many blogs as you like until you've got a full set - exactly how you want it. Clearly, the potential to only find out about what you already want to hear is much greater.

New or not, it's certainly a problem. The good thing is that the internet makes it extremely easy to snap out of the filter bubble. A completely different perspective is just a click away: that's new, as well. All you need is to want to do that.

Why should you? Always reading stuff that you already agree with isn't the best way to get informed about something. Actually, it's just about the worst way to do that. If you're serious about wanting to learn the truth about something, you need to (critically) read different sources. But beyond that, it's just boring to always do the same things. There are a lot of cool things going on that you've never heard of.

Finally, if you're a blogger, remember that you're not just telling readers your opinions, you're helping them to filter out other people's. You don't have to feel bad about that - it's inevitable - but remember: if you really want to help your readers understand something, you need to tell them about the areas of disagreement.

I don't just mean linking to stupid people and then explaining why they're stupid. That's fun, but if you're serious, you need to link to the best examples of alternative views and give them a fair hearing. This is something that I feel I could do more of on this blog, and I hope to do it more in future.

Wednesday, May 11, 2011

Duck or Rabbit?

Ambiguous figures are drawings that seem to flip from being one thing to another.

Psychologists Melissa Allen and Alison Chambers recently showed these images to teenagers with autism in an attempt to find out whether they were able to perceive the effect normally: Implicit and explicit understanding of ambiguous figures by adolescents with autism spectrum disorder

A leading theory of autism is weak central coherence - the idea that autistic people tend to be focussed on details, rather than the "big picture". This might predict that autism would interfere with the perception of these figures because the ambiguity is all about the global, gestalt meaning: the details are fixed, but you can see them as adding up to two different things.

The autistic teens and a control group were shown the images and asked to copy them using pen and paper. Then their drawings were rated for "duckness" or "rabbitness", or equivalent, by a rater who wasn't told which diagnosis the person who drew them had.

The results showed that the autistic group were able to perceive both interpretations of the figures, and were equally likely to report experiencing the "reversal" phenomenon in which the image seems to flip. However, when it came to the drawings, they were less biased by being told which interpretation to use. When the instructions said "Draw this rabbit" as opposed to "Draw this picture", controls tended to make their copy more rabbity, but autistic people copied it faithfully.


Beyond their relevance to autism, these kinds of pictures are interesting because they tell us something important about perception.

You can't see these images for what they really are. They really are ambiguous - they're neither duck nor rabbit. They're both. However, our brains insist that they are one or the other, at any one time. They're duck, rabbit, duck, rabbit. But they never seem to be a "duckrabbit". Not for me anyway. Even though I know, in an abstract sense, that this is what they really are.

Both "duck" and "rabbit" are things we've encountered a thousand times before. So we seem to be drawn to see them in those familiar terms. "Duckrabbits" are unheard of, outside psychology. Rather than sit on the fence, our perceptions fall into the well-worn grooves of our preexisting categories.

Allen ML, & Chambers A (2011). Implicit and explicit understanding of ambiguous figures by adolescents with autism spectrum disorder. Autism: The International Journal of Research and Practice. PMID: 21486897

Monday, March 21, 2011

Turn That Off, I'm Writing

I listen to a lot of music.

Music is playing in the background most of the time whether I'm on the computer or not (thanks to my old, poorly-designed but still faithful Shuffle). However I've noticed that I find myself turning off iTunes when I'm writing.

Right now, for example, I have just put this song on pause because I'm writing this post (a post about why I paused that song - bit of a chicken-and-egg situation there.) I can't write with a song on, because the lyrics would be distracting.

However, I don't always do this when I'm typing. With some songs, and some kinds of writing, it's OK. I think this is how it works:

Instrumental songs are obviously OK. But I don't listen to many.

More interestingly, songs I've listened to many times are fine. I've just put on this which, according to iTunes, I have listened to no fewer than 140 times over the last three years. And this is fine. No distraction. I think the reason must be that I'm so used to the lyrics that the language part of my mind no longer needs to work out what they mean.

Some kinds of writing are compatible with songs. Blogging isn't, and writing "important" emails isn't, but a lot of emails are. Which I guess means that I'm not really putting much effort into writing them. I must be typing on auto-pilot, just repeating stock phrases ("Sounds good") rather than actually using my language areas, or at least, not using them very hard.

Psychologists are fond of using these kinds of selective distraction tasks to map out the architecture of the mind e.g. verbal ones distract verbal working memory but not spatial, and vice versa. So this is all pretty standard stuff, but what's interesting is that it's not intuitively obvious.

It doesn't feel like sometimes when I'm writing my language faculty is hard at work, and other times it's not. It feels like I'm thinking about what I type all the time. It's just typing. Sometimes I'll be replying to a bunch of emails, music on full blast, and then I'll find myself putting it on pause when I get to one particular email; but I couldn't tell you in advance which one it would be. It just feels right. Better turn the music off for this one, this one's serious - though even that's putting it too strongly. It doesn't feel serious, it just feels like the music needs to be off.

Our conscious experience is smooth and seamless even though we're constantly switching between using different parts of our brains. This becomes all too evident in the case of brain lesions, which can rob us of capacities we never knew we had, because they were always there when we needed them. Some lesions, for example, render you completely unaware of anything that happens to your left. It doesn't seem like we're using a different part of our brain when dealing with stuff on the left as opposed to the right - but we are.

Saturday, February 19, 2011

The Web of Morgellons

A fascinating new paper: Morgellons Disease, or Antipsychotic-Responsive Delusional Parasitosis, in an HIV Patient: Beliefs in The Age of the Internet

“Mr. A” was a 43-year-old man...His most pressing medical complaint was worrisome fatigue. He was not depressed...had no formal psychiatric history, no family psychiatric history, and he was a successful businessman.

He was referred to the psychiatry department by his primary-care physician (PCP) because of a 2-year-long complaint of pruritus [itching] accompanied by the belief of being infested with parasites. Numerous visits to the infectious disease clinic and an extensive medical work-up...had not uncovered any medical disorder, to the patient’s great frustration.

Although no parasites were ever trapped, Mr. A caused skin damage by probing for them and by applying topical solutions such as hydrogen peroxide to “bring them to the surface.” After reading about Morgellons disease on the Internet, he “recalled” extruding particles from his skin, including “dirt” and “fuzz.”

During the initial consultation visit with the psychiatrist, Mr. A was apprehensive but cautiously optimistic that a medication could help. The psychiatrist had been forewarned by the PCP that the patient had discovered a website describing Morgellons and “latched onto” this diagnosis.

However, it was notable that the patient allowed the possibility (“30%”) that he was suffering from delusions (and not Morgellons), mostly because he trusted his PCP, “who has taken very good care of me for many years.”

The patient agreed to a risperidone [an antipsychotic] trial of up to 2 mg per day. [i.e. a lowish dose]. Within weeks, his preoccupation with being infested lessened significantly... Although not 100% convinced that he might not have Morgellons disease, he is no longer pruritic and is no longer damaging his skin or trying to trap insects. He remains greatly improved 1 year later.
(Mr A. had also been HIV+ for 20 years, but he still had good immune function and the HIV may have had nothing to do with the case.)

"Morgellons" is, according to people who say they suffer from it, a mysterious disease characterised by the feeling of parasites or insects moving underneath the skin, accompanied by skin lesions out of which emerge strange, brightly-coloured fibres or threads. Other symptoms include fatigue, aches and pains, and difficulty concentrating.

According to almost all doctors, there are no parasites, the lesions are caused by the patient's own scratching or attempts to dig out the non-existent critters, and the fibres come from clothes, carpets, or other textiles which the patient has somehow inserted into their own skin. It may seem unbelievable that someone could do this "unconsciously", but stranger things have happened.

As the authors of this paper, Freudenreich et al, say, Morgellons is a disease of the internet age. It was "discovered" in 2002 by a Mary Leitao, with Patient Zero being her own 2-year-old son. Since then its fame, and the reported number of cases, have grown steadily - especially in California.

Delusional parasitosis is the opposite of Morgellons: doctors believe in it, but the people who have it, don't. It's seen in some mental disorders and is also quite common in abusers of certain drugs like methamphetamine. It feels like there are bugs beneath your skin. There aren't, but the belief that there are is very powerful.

This, then, is the raw material in most cases; what the concept of "Morgellons" adds is a theory, a social context and a set of expectations that helps make sense of the otherwise baffling symptoms. And as we know, expectations, whether positive or negative, tend to become experiences. The diagnosis doesn't create the symptoms out of nowhere but rather takes them and reshapes them into a coherent pattern.

As Freudenreich et al note, doctors may be tempted to argue with the patient - you don't have Morgellons, there's no such thing, it's absurd - but the whole point is that mainstream medicine couldn't explain the symptoms, which is why the patient turned to less orthodox ideas.

Remember the extensive tests that came up negative "to the patient’s great frustration." And remember that "delusional parasitosis" is not an explanation, just a description, of the symptoms. To diagnose someone with that is saying "We've no idea why but you've imagined this". True, maybe, but not very palatable.

Rather, they say, doctors should just suggest that maybe there's something else going on, and should prescribe a treatment on that basis. Not rejecting the patient's beliefs but saying, maybe you're right, but in my experience this treatment makes people with your condition feel better, and that's why you're here, right?

Whether the pills worked purely as a placebo or whether there was a direct pharmacological effect, we'll never know. Probably it was a bit of both. It's not clear that it's important, really. The patient improved, and it's unlikely that the pills would have worked as well if they'd been given in a negative atmosphere of coercion or rejection - if indeed he'd agreed to take them at all.

Morgellons is a classic case of a disease that consists of an underlying experience filtered through the lens of a socially-transmitted interpretation. But every disease is that, to a degree. Even the most rigorously "medical" conditions like cancer also come with a set of expectations and a social meaning; psychiatric disorders certainly do.

I guess Morgellons is too new to be a textbook case yet - but it should be. Everyone with an interest in the mind, everyone who treats diseases, and everyone who's ever been ill - everyone really - ought to be familiar with it because while it's an extreme case, it's not unique. "All life is here" in those tangled little fibres.

Freudenreich O, Kontos N, Tranulis C, & Cather C (2010). Morgellons disease, or antipsychotic-responsive delusional parasitosis, in an HIV patient: beliefs in the age of the internet. Psychosomatics, 51(6), 453-7. PMID: 21051675

Thursday, February 17, 2011

WMDs vs MDD

Weapons of Mass Destruction. Nuclear, chemical and biological weapons. They're really nasty, right?

Well, some of them are. Nuclear weapons are Very Destructive Indeed. Even a tiny one, detonated in the middle of a major city, would probably kill hundreds of thousands. A medium-sized nuke could kill millions. The biggest would wipe a small country off the map in one go.

Chemical and biological weapons, on the other hand, while hardly nice, are just not on the same scale.

Sure, there are nightmare scenarios - a genetically engineered supervirus that kills a billion people - but they're hypothetical. If someone does design such a virus, then we can worry. As it is, biological weapons have never proven very useful. The 2001 US anthrax letters killed 5 people. Jared Loughner killed 6 with a gun he bought from a chain store.

Chemical weapons are little better. They were used heavily in WW1 and the Iran-Iraq War against military targets and killed many but never achieved a decisive victory, and the vast majority of deaths in these wars were caused by plain old bullets and bombs. Iraq's use of chemical weapons against Kurds in Halabja killed perhaps 5,000 - but this was a full-scale assault by an advanced air force, lasting several hours, on a defenceless population.

When a state-of-the-art nerve agent was used in the Tokyo subway attack, after much preparation by the cult responsible, who had professional chemists and advanced labs, 13 people died. In London on the 7th July 2005, terrorists killed 52 people with explosives made from haircare products.

Nuclear weapons aside, the best way to cause mass destruction is just to make an explosion, the bigger the better; yet conventional explosives, no matter how big, are not "WMDs", while chemical and biological weapons are.

So it seems to me that the term and the concept of "WMDs" are fundamentally unhelpful. They lump together the apocalyptically powerful with the much less destructive. If you have to discuss everything except guns and explosives in one category, terms like "unconventional weapons" are better, as they avoid the misleading implication that all of these weapons are very, and equivalently, deadly; but grouping them together at all is risky.

That's WMDs. But there are plenty of other unhelpful concepts out there, some of which I've discussed previously. Take the concept of "major depressive disorder", for example. At least as the term is currently used, it lumps together extremely serious cases requiring hospitalization with mild "symptoms" which 40% of people experience by age 32.

Sunday, February 6, 2011

Did My Genes Make Me Do It?

A curious legal case from New York raises some interesting issues:
Court Rejects Judge’s Assertion of a Child Pornography Gene

According to the NYT:
A federal appeals court in Manhattan overturned a 6.5 year sentence in a child pornography case on Friday, saying the judge who imposed it improperly found that the defendant would return to viewing child pornography "because of an as-of-yet undiscovered gene."

The judge, Gary L. Sharpe, was quoted as saying, "It is a gene you were born with. And it’s not a gene you can get rid of," before he sentenced the defendant...

A three-judge panel of the United States Court of Appeals for the Second Circuit said in ruling on the defendant’s appeal, "It would be impermissible for the court to base its decision of recidivism on its unsupported theory of genetics."
Now I think we can all agree that judges shouldn't be handing down sentences on the basis of entirely hypothetical genes. However, things become a bit less clear if we imagine that the defendant did have a verified genetic abnormality. What then?

As chance would have it, this has just happened in Britain. On Thursday, former delivery driver Alan Potsbury, or as he was known to his colleagues, "Al The Paedo", was convicted of... well, the obvious.

Anyway, Potsbury has Klinefelter's Syndrome, aka XXY syndrome. Normally, women have two X chromosomes, while men have an X and a Y chromosome. People with Klinefelter's have three sex chromosomes, two X and a Y. They're male, but can experience various symptoms as a result of their extra X, although these are often pretty subtle, and the condition often goes undiagnosed.

Now I have no idea whether Potsbury's responsibility for his crime is lessened by the fact that he had a genetic disorder. And I certainly don't want to suggest that Klinefelter's "makes people into paedophiles", not least because in the vast majority of cases, it doesn't.

However, let's assume, just for the sake of argument, that in this particular case he wouldn't have done what he did if it weren't for his extra chromosome. Or let's consider any hypothetical case where someone committed a crime "because of" a certain gene. Does this mean, as Judge Sharpe was suggesting, that their behaviour will be unlikely to change, and hence that heavy sentences are justified since rehabilitation won't work?

No. The fact that someone's past behaviour was associated with a gene doesn't tell us anything about how easy it would be to change it.

Being a Christian as opposed to a Muslim is, as far as we know, nothing to do with genetics; it's purely a matter of how you were brought up. Yet it's incredibly difficult to change. Many Christians and many Muslims spend their lives trying to make the heathens adopt the true faith and yet the number of successful conversions either way is tiny.

Hair colour, on the other hand, is entirely genetic. Yet it's easy to change. Just buy some bleach and some dye and you can have whatever hair you like. Or if you don't want hair at all, shave it off. You can't change your hair-colour genes, but you can make them irrelevant.

Back to Potsbury: even if we did accept that his paedophilia was in some way a result of his Klinefelter's, that wouldn't mean he was doomed to reoffend. Some behaviours are harder to change than others. Some are more genetic than others. But we can't assume that the one implies the other.

Sunday, December 5, 2010

Online Comments: It's Not You, It's Them

Last week I was at a discussion about New Media, and someone mentioned that they'd been put off from writing content online because of a comment on one of their articles accusing them of being "stupid".

I found this surprising - not the comment, but that anyone would take it so personally. It's the internet. You will get called names. Everyone does. It doesn't mean there's anything wrong with you.

I suspect this is a generational issue. People who 'grew up online' know what Penny Arcade explained long ago: give an ordinary person anonymity and an audience, and they turn into a fuckwad.

The sad fact is that there are millions of people whose idea of fun is to find people they disagree with, and mock them. And they're right, it can be fun - why else do you think people like Jon Stewart are so popular? - but that's all it is, entertainment. If you're on the receiving end, don't take it seriously.

If you write something online, and a lot of people read it, you will get slammed. Someone, somewhere, will disagree with you and they'll tell you so, in no uncertain terms. This is true whatever you write about, but some topics are like a big red rag to the herds of bulls out there.

Just to name a few, if you say anything vaguely related to climate change, religion, health, the economy, feminism or race, you might as well be holding a placard with a big arrow pointing down at you and "Sling Mud Here" on it.

The point is - it's them, not you. They are not interested in you, they don't know you, it's not you. True, they might tailor their insults a bit; if you're a young woman you might be, say, a "stupid girl" where a man would merely get called an "idiot". But this doesn't mean that the attacks are a reflection on you in any way. You just happen to be the one in the line of fire.

What do you do about this? Nothing.

Trying to enter into a serious debate is pointless. Insulting them back can be fun, just remember that if you find it fun, you've become one of them: "he who stares too long into the abyss...", etc. Complaining to the moderators might help, but unless the site has a rock solid zero-tolerance-for-fuckwads policy, probably not. Where the blight has taken root, like Comment is Free, I'd not waste your time complaining. Just ignore it and carry on.

The most important thing is not to take it personally. Do not get offended. Do not care. Because no-one else cares. Especially the people who wrote the comments. They presumably care about whatever "issue" prompted their attack, but they don't care about you. If anything, you should be pleased, because on the internet, the only stuff that doesn't attract stupid comments is the stuff that no-one reads.

I've heard these attacks referred to as "policing" existing hierarchies or "silencing" certain types of people. This seems to me to be granting them far more respect than they deserve. With the actual police, if you break the rules, they will physically arrest you. They have power. Internet trolls don't: if they succeed in policing or silencing anybody, it's because their targets let them boss them around. They're nobody; they're not your problem.

If you can't help being offended by such comments, don't read them, but ideally you shouldn't need to resort to that. For one thing, it means you miss the sensible comments (and there's always a few). But fundamentally, you shouldn't need to do this, because you really shouldn't care what some anonymous joker from the depths of the internet thinks about you.
