If you look at the fulltext, this is for a mutation analyzed via the old standard Mendelian techniques, not a brute-force SNP analysis, as your link refers to:
Fortunately, the process of Mendelian inheritance allows a further robust test of the causality of the Arg844His mutation to be performed, avoiding the usual problems of confounding from other variables that might aggregate in a familial fashion but might be unlinked to the mutation. To test the relevance of the mutation, affected and unaffected first‐degree sibs in each generation (II:2, II:6, II:8 vs II:4; III:5 vs III:7; IV:3 vs IV:4, IV:5) underwent identical cognitive testing (table 1). The mean age of affected members (53.8 years) was higher than that of unaffected members (41.1 years), acting conservatively against any superior cognitive performance in the affected group, given the known effect of age.^21 We tested the significance of the association between affected status and every measured cognitive phenotype following a randomisation approach that accounts for the kindred structure.^22...The single score for the overall difference in all mean phenotype values between the affected and non‐affected subjects was significant (p = 0.006). For individual phenotypes, all except those for immediate story recall, delayed story recall and semantic fluency had p values <5%: vocabulary (p = 0.014), digit span (p = 0.020), similarities (p = 0.039), phonemic fluency (p = 0.050), cognitive estimates (p = 0.012), immediate story recall (p = 0.112), delayed story recall (p = 0.087), immediate verbal learning (p = 0.048), delayed verbal learning (p = 0.042), semantic fluency (p = 0.220) and Hayling test (p = 0.010). As expected, given that the three subphenotypes of VIQ were all significant, when we separately tested VIQ itself this was also significant (p = 0.014)....Could the RIMS1 mutation be a chance finding, unrelated to the eye or cognitive phenotype?
The intrafamilial distribution of cognitive measures argues that the detected mutation is most probably causative, especially as it segregates with both the eye phenotype (which becomes clinically symptomatic) and the cognitive enhancement. Co‐mingling in each outbred generation of mutation‐carrying and wild‐type sibs each with respective enhanced or normal cognitive phenotype and the respective co‐segregating impaired or normal visual phenotype renders extremely unlikely the possibility of an intrafamilial founder effect unrelated to the RIMS1 mutation, as supported quantitatively by our modelling, beyond linkage analysis alone.
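The paper only describes its randomisation test at a high level, but the idea of a kindred-aware permutation test can be sketched simply: shuffle the affected/unaffected labels *within* each sibship (so the null distribution respects the family structure), recompute the affected-vs-unaffected mean difference each time, and ask how often a shuffled difference is at least as extreme as the observed one. Everything below (sibships, scores, function names) is invented for illustration, not taken from the paper:

```python
import random

# Made-up cognitive scores grouped by sibship: (score, affected?) per sib.
sibships = [
    [(112, True), (105, True), (99, False)],
    [(118, True), (101, False)],
    [(109, True), (96, False), (100, False)],
]

def mean_diff(groups):
    """Affected minus unaffected mean score, pooled across all sibships."""
    aff = [s for g in groups for (s, a) in g if a]
    unaff = [s for g in groups for (s, a) in g if not a]
    return sum(aff) / len(aff) - sum(unaff) / len(unaff)

def permute_within_sibships(groups, rng):
    """Shuffle the affected labels within each sibship, keeping families intact."""
    out = []
    for g in groups:
        labels = [a for (_, a) in g]
        rng.shuffle(labels)
        out.append([(s, a) for (s, _), a in zip(g, labels)])
    return out

def permutation_p(groups, n=10_000, seed=0):
    """Two-sided p-value: fraction of label shuffles at least as extreme."""
    rng = random.Random(seed)
    observed = abs(mean_diff(groups))
    hits = sum(
        abs(mean_diff(permute_within_sibships(groups, rng))) >= observed
        for _ in range(n)
    )
    return hits / n

print(mean_diff(sibships))      # observed difference: 12.0
print(permutation_p(sibships))  # roughly 1/18 ≈ 0.056 for this toy data
```

Because labels are only exchanged within sibships, any confound shared by all sibs in a family cancels out of the null distribution, which is presumably the point of the authors' kindred-structure approach.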
And the paper provides an excellent reason that there have been no replications so far (italics added):

We now report on the functional and structural effects of mutation in the eye‐ and brain‐expressed gene RIMS1, through the study of individuals from a family already reported to have retinal dystrophy caused by RIMS1 mutation.^12,13 To our knowledge, this is the only family so far reported with such a mutation: the eye phenotype is homogeneous in the family, and has been documented in detail.^13...We show that a mutation in RIMS1 is associated, in the only reported kindred with any RIMS1 mutation, with significantly enhanced cognitive function in at least the verbal (likely to be related to general ability, g) and executive domains. RIMS1 is an excellent candidate gene to influence cognitive function.
Indeed, precisely as anyone familiar with the criticisms of SNP studies would expect (this being a rare Mendelian mutation rather than a common variant), the genotyping turned up nothing in their sample:
We sequenced the mutation‐containing RIMS1 exon 13 in a panel of 50 unrelated individuals with autosomal dominant cone–rod dystrophy, but did not detect any mutations. Common variation in RIMS1 (uncorrelated with the rare Arg844His mutation) did not influence cognitive function in LBC1921, for either genotype or haplotype (see supplementary tables 2 and 3 available online at http://jmg.bmj.com/supplemental). To determine if mutation in RIMS1 might account for the upper extreme of performance on cognitive measures, the entire RIMS1 gene was sequenced in the top‐scoring 5% of the LBC (24 individuals). Only one, previously unreported, SNP was found, in residue 592, exon 9: it was synonymous, conserving a glutamic acid residue in an unremarkable region of the gene.
This suggests the only way to investigate this further (ethically) would be either to sequence many millions of people at staggering expense in the hope of finding a second family with this mutation who had not been reported in the literature as regularly going blind for no reason, or to mutate animals and hope the relevant systems are similar enough that the result is not too misleading. (The paper cites research showing animals with no RIMS1 as being stupider, so at least we do know the gene affects the brain somehow.)
The fact that the mutation causes you to start to go blind in your 20s also helps its plausibility, as it handily eliminates the evolutionary argument that a pure cognitive boost should already have spread through the population: going blind is a pretty damn big deal (“Any evolutionary advantage of this particular mutation could be counterbalanced by the concomitant severe visual phenotype, albeit late onset.”).
So, unless I’ve misunderstood some of the technical details, this actually strikes me as a plausible claim—albeit one that is useless for practical purposes like embryo selection, since that exact mutation wouldn’t occur for anyone &, if it did, you probably would not want to pick that embryo, because going blind is a terrible thing to inflict on someone.
(Of course, you might argue that the benefit is worth it, especially since given another 30 years or so, when the degeneration really sets in, you might have the option of either replacing your eyes with improved computer vision assistance methods/implants or perhaps using CRISPR retinal therapy to try to remove the mutation from just your eyes and preserve your vision that way. It would be a very risky move, though, in assuming the IQ boost is real and that progress will be fast enough to save your eyes.)
Thanks Gwern.