The SNR formula for a perfect ADC is 6.02*bits + 1.76 dB, not just 6*bits; ENOB is then recovered as (SINAD - 1.76)/6.02. The rms quantization noise is 1/sqrt(12) of an LSB (about 0.29 LSB), not 1/2 an LSB - 1/2 an LSB is the peak error - and together with the crest factor of a full-scale sine this accounts for the extra 1.76 dB.
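As a quick numerical sanity check, here is a minimal sketch (in units of one LSB, with 12 bits picked purely for illustration) showing that the sqrt(1/12) noise figure reproduces the 6.02*bits + 1.76 dB result:

```python
import numpy as np

bits = 12                               # illustrative resolution
lsb = 1.0                               # work in units of one LSB

# RMS of quantization error uniformly distributed over +/- LSB/2
rms_noise = lsb / np.sqrt(12)           # ~0.289 LSB, not 0.5 LSB

# Full-scale sine: peak 2**(bits-1) LSB, rms = peak / sqrt(2)
rms_signal = 2**(bits - 1) / np.sqrt(2)

snr_db = 20 * np.log10(rms_signal / rms_noise)
print(snr_db)                           # ~74.0 dB
print(6.02 * bits + 1.76)               # same, ~74.0 dB
```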
When testing an ADC to measure its effective number of bits, and the converter is close to ideal, you cannot use a fixed input voltage as the test signal, since how well the converter appears to perform would depend on how close to the centre of a code bin the test voltage happened to sit. However, if the error is several LSBs or more this becomes unimportant, and the probability distribution of the output codes becomes a useful measure of rms quantization noise.
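A minimal sketch of using the distribution this way, assuming a hypothetical capture file of repeated conversions of a fixed DC input (the file name and format are my assumption):

```python
import numpy as np

# Hypothetical capture: repeated conversions of a fixed DC input,
# one output code per line.
codes = np.loadtxt("dc_codes.txt")

# Only meaningful when the total error spans several LSBs; for a
# near-ideal converter the spread depends on where the DC level
# happens to sit within a code bin.
rms_noise_lsb = np.std(codes)
print(f"rms noise ~ {rms_noise_lsb:.2f} LSB")
```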
Sampling a sinusoid and measuring the noise floor from the power spectral density plot is a method that works in either case, and it will show up any spurs in the spectrum too - if the signal source is of high purity you can measure non-linearities as well.
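A sketch of that sine-wave test, with an ideal quantizer standing in for the ADC under test; the record length and cycle count are illustrative, chosen coherent (an odd number of cycles in a power-of-two record) so no window is needed:

```python
import numpy as np

bits, n, cycles = 12, 4096, 127          # cycles odd -> coprime with n
t = np.arange(n)
x = np.sin(2 * np.pi * cycles * t / n)   # full-scale test sinusoid
q = np.round(x * (2**(bits - 1) - 1))    # ideal quantizer as stand-in

spec = np.abs(np.fft.rfft(q))**2         # power spectrum
signal = spec[cycles]                    # tone lands exactly in this bin
noise = spec[1:].sum() - signal          # everything else: noise + spurs

sinad = 10 * np.log10(signal / noise)
enob = (sinad - 1.76) / 6.02
print(f"SINAD = {sinad:.1f} dB, ENOB = {enob:.2f} bits")
```

With real captured data you would replace q with the recorded codes; any spurs simply fall into the noise term, which is what makes this a SINAD (and hence ENOB) measurement rather than a pure SNR one.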
Using a slowly ramping voltage and then plotting a histogram of the output codes (having corrected for the slope) is probably an easier method, as the signal source can be as simple as a large capacitor with a bleed resistor - the RC discharge is close to linear over a small fraction of a time constant, and the slope correction is basically a least-squares linear regression.
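A sketch of that slope correction and histogram, again assuming a hypothetical capture file of codes taken while the capacitor bleeds down:

```python
import numpy as np

# Hypothetical capture: output codes recorded while an RC source
# bleeds down slowly, one code per line.
codes = np.loadtxt("ramp_codes.txt").astype(int)

# Least-squares fit of code vs. sample index, then look at residuals:
# this is the slope correction mentioned above.
t = np.arange(len(codes))
slope, intercept = np.polyfit(t, codes, 1)
residual = codes - (slope * t + intercept)

rms_noise_lsb = np.std(residual)         # rms noise in LSB, slope removed
hist = np.bincount(codes - codes.min())  # code-density histogram;
                                         # uneven bin counts reveal DNL
print(f"rms noise ~ {rms_noise_lsb:.2f} LSB")
```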