Whenever wilcox.test() is given a constant vector, it exits with an error:

> wilcox.test(c(1,1,1,1,1),c(1,1,1,1,1,1),conf.int=T)
Error in if (f.lower <= 0) return(mumin) :
  missing value where TRUE/FALSE needed

This can be particularly disruptive (and obscure) if you are using wilcox.test() as part of an apply() call.
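As a workaround until the bug is fixed, wrapping the call in tryCatch() stops a single degenerate column from aborting a whole apply() run. This is just a sketch; the safe_wilcox name is my own:

```r
# Hypothetical wrapper: return NA instead of aborting when wilcox.test() errors
safe_wilcox <- function(x, y) {
  tryCatch(wilcox.test(x, y, conf.int = TRUE)$p.value,
           error = function(e) NA_real_)
}

# Example: one constant column, one varying column
m <- cbind(const = c(1, 1, 1), vary = c(1, 2, 3))
apply(m, 2, function(col) safe_wilcox(col, c(4, 5, 6, 7)))
```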

To be precise, this only occurs when conf.int=TRUE. The proximate cause is that SIGMA.CI becomes zero, so there is a divide-by-zero condition. Note, by the way, that the corner case wilcox.test(1,2,conf.int=TRUE) gives the inconsistent

-------
0 percent confidence interval:
 -1 -1
sample estimates:
difference in location
                    -1

Warning message:
In wilcox.test.default(1, 2, conf.i = T) :
  Requested conf.level not achievable
---------

(this could well be a platform-dependent rounding issue). An easy fix would seem to be to make the internal root() function return NaN if it encounters is.na(f.lower), but perhaps an explicit test would be better.
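The suggested guard can be illustrated with a toy bisection root-finder. This is not the actual internal root() from wilcox.test.default, just a sketch of the idea, assuming a decreasing objective function:

```r
# Toy bisection solver illustrating the proposed NaN guard.
# Not the actual internal root() from wilcox.test.default.
root <- function(f, mumin, mumax, tol = 1e-8) {
  f.lower <- f(mumin)
  if (is.na(f.lower)) return(NaN)  # proposed guard: propagate NaN instead of erroring
  if (f.lower <= 0) return(mumin)
  while (mumax - mumin > tol) {
    mid <- (mumin + mumax) / 2
    if (f(mid) > 0) mumin <- mid else mumax <- mid
  }
  (mumin + mumax) / 2
}
```

With the guard in place, a constant input that makes f evaluate to NA yields NaN (which the caller can test for explicitly) rather than the "missing value where TRUE/FALSE needed" error.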

Changed in R 2.12.0.