russg Member R Join Date Aug 2012 Location UK Posts 280 Oct 19, 2015 #1 Hi, I'm just trying to get my head around how an analogue input with 12-bit resolution is able to use the range 0 to 32000. Surely you need at least 15 bits to represent a number that large? What am I missing? Thanks
Steve Bailey Lifetime Supporting Member + Moderator Join Date Apr 2002 Location The boondocks of Western Massachusetts USA Posts 8,617 Oct 19, 2015 #2 If you watch the raw analog data you'll see it change in increments of eight instead of one. A 12-bit converter only gives 4096 distinct steps; the card left-justifies those 12 bits in the data word, so each step is worth 8 counts and 4096 steps span roughly 0 to 32760.
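A minimal sketch of the arithmetic behind Steve's answer (the shift amount and exact top-of-range vary by analog card, so treat the numbers as illustrative):

```python
# A 12-bit ADC produces raw codes 0..4095 (2**12 - 1 = 4095).
# If the card left-justifies that code in the data word by shifting
# it up 3 bits, the reported value spans roughly 0..32760 but can
# only change in steps of 8 -- the effect described above.

ADC_BITS = 12
SHIFT = 3  # assumed left-justification; card-specific

def reported_value(raw_code: int) -> int:
    """Left-justify a 12-bit ADC code into a ~15-bit range."""
    if not 0 <= raw_code < 2 ** ADC_BITS:
        raise ValueError("raw code out of 12-bit range")
    return raw_code << SHIFT

print(reported_value(0))     # 0
print(reported_value(1))     # 8  (smallest possible change)
print(reported_value(4095))  # 32760 (top of the scaled range)
```

So the input still has only 4096 distinct values; the 0-to-32000-ish range is just those values spread out in increments of eight.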
russg Member OP R Join Date Aug 2012 Location UK Posts 280 Oct 19, 2015 #3 Ah yes, of course. Thank you for your quick response.