Prosopagnosia is "An inability or difficulty in recognizing familiar faces; it may be congenital or result from injury or disease of the brain." I've talked about this before. Well, Jake "The Superficial" Young now has a follow-up post on the paper which elicited my initial skepticism. Since you can read the abstract, here is the conclusion:
Congenital PA is the only known monogenic dysfunction of a higher cognitive visual skill. Among more than 90 different cognitive functions (e.g., musical mind, absolute pitch) and dysfunctions (e.g., agraphia, dyscalculia, dyslexia) related to specific cognitive behavioral and neurological disorders we could only find a few monosymptomatic conditions in the OMIM database (http://www.ncbi.nlm.nih.gov/Omim/) with proven or suggested heredity: perfect musical pitch, syn. absolute pitch...; specific language impairment (SLI)...; specific dyslexia, syn. word-blindness...familial developmental dysphasia...; tune deafness; syn. dysmelodia..; congenital anosmia; syn. odor blindness...; inability to smell musk...; hereditary whispering dysphonia...; indifference to pain, syn. congenital analgesia.... Gene mapping was successful, in one disorder..., a single but large family in which half of the members had orofacial dyspraxia and severe speech and language impairment. A point mutation was found in the FOXP2 (forkhead box protein 2) gene co-segregating with the disorder.... There are several susceptibility genes described in the heterogeneous group of dyslexia with regular segregation. The resulting phenotypes do not represent this in any case.
Here is the take-home message: around 2-3% of this sample lack a "normal" human cognitive ability, and this is due to an autosomal dominant mutation. First, this is strange because mental abilities are rather complex, and it seems that simple Mendelian inheritance would be unlikely to crop up except in extremely rare pathologies which "break" essential functionality in key pathways. I think it is interesting that this hereditary condition is sharply demarcated not only from other cognitive traits, but also from other aspects of visual perception of the face (e.g., expression). This is suggestive (to me) of a hypermodularity that would probably take most people by surprise. The human ability to recognize faces is a gestalt competence, like language. It is not part of our general intelligence; it is a specialized ability we evolved to recognize conspecifics with a fluidity and rapidity that no doubt facilitated extracting maximal utility from social interactions. Other species have their own conspecific-recognition capabilities; this isn't rocket science. The lack of this ability in a non-trivial number of humans is suspicious. I think it would be analogous (even though the functionality is an order of magnitude less crucial) to 1 out of 50 humans lacking the ability to speak. A researcher who is involved in this work states:
Their problem is in recognizing faces, particularly faces out of context. This can occur frequently in modern urban society with its crowds, great mobility plus the surfeit of facial images. It probably occurred less often in traditional societies, thus reducing selection pressure.
What is a "traditional society"? I think there is considerable evidence that a conventional pre-modern circle of friendship, familial relation and acquaintanceship extended to no more than 200 individuals, and more often one would be habituated to only a few dozen at most (the "band" as opposed to the "clan," and above that, the "tribe"). Clearly a facial recognition deficit would not spell death in such a society, and the awkwardness would be mitigated by familiarity with everyone around you. But if you read the paper you will note that these individuals tend not to understand why you have to look at people in the face when you talk to them. This, and other problems which emerge out of this deficit, suggest to me that even in pre-modern societies the
reproductive fitness of those who possess this allele should be lower than that of those who do not
. Mutation-selection balance for a dominant allele in a diploid population is: Frequency = (mutation rate)/(selection coefficient). Selection presumably acts against this allele. If you assume a selection coefficient of 0.1, that is, someone who carries this allele (and because of dominance exhibits the trait) has, on average, 90% of the reproductive output of someone who does not, then since the sampled population exhibited a frequency of (about) 2%, the implied mutation rate is 0.02 × 0.1 = 0.002. If the selection coefficient is 0.01, you still have a mutation rate of 0.0002 at the locus. The mutation rate in humans is actually much lower than this, on the order of 10^-7 to 10^-8. Even for putative "polygenes" the mutation rate is on the order of 10^-3 to 10^-5. You see where I'm going with this? It could be that this locus, for whatever reason, is hyperactive in terms of mutations. But I suspect that a better explanation is that
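The back-of-the-envelope calculation above can be sketched in a few lines of Python. This is just the equilibrium approximation q ≈ μ/s rearranged to μ ≈ q × s; the 2% frequency comes from the study's sample, and the two selection coefficients are the hypothetical values discussed above.

```python
# Mutation-selection balance for a fully dominant deleterious allele:
# at equilibrium, allele frequency q ~ mu / s, so an observed frequency
# implies a mutation rate of mu ~ q * s.

def implied_mutation_rate(frequency: float, selection_coefficient: float) -> float:
    """Mutation rate implied by mutation-selection balance (dominant allele)."""
    return frequency * selection_coefficient

freq = 0.02  # ~2% of the sample show the trait

# Hypothetical selection coefficients from the discussion above
for s in (0.1, 0.01):
    mu = implied_mutation_rate(freq, s)
    print(f"s = {s}: implied mutation rate ~ {mu:.0e}")
```

Either way, the implied per-locus mutation rate (2 × 10^-3 or 2 × 10^-4) is orders of magnitude above the realistic 10^-7 to 10^-8 range, which is the tension the argument turns on.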
at some point there was selection for this allele because of a pleiotropic side effect
. Since this is an autosomal dominant, it is not unlikely that there is a trans-acting factor which is interfering with the other copy of the gene which controls this trait. Additionally, this novel variant might have a role in various biochemical and molecular genetic pathways which may be selectively beneficial. Anyway, there are many directions you can go with this. But the study was conducted in Germany. I think a key finding will be that the frequency of this condition varies across populations, because I would not be surprised if the selective event that might have occurred in Northern Europe did not occur elsewhere.