People generally count in decimal. However, there are some things in this world that don’t. I used to count only in decimal, but these days I tend to reserve decimal for big numbers and actually count in hexadecimal for small numbers and binary for really small ones.
For me, it’s an occupational hazard. I know it may sound a little weird, but it’s true. If someone asks me to count stuff, I tend to count in binary for small numbers. If someone asks me what 3+9 is, the first number that pops into my head is ‘C’ (hexadecimal for 12).
I think that this is a by-product of designing too much computer stuff. People who work closely with computers soon realise that it is far easier to think in binary, octal or hexadecimal than in decimal. The reason is simple: computers do not work in decimal.
When I first started out years ago, I used to do conversions between hexadecimal and decimal on an ordinary calculator, as I did not have a scientific one. I only got a scientific calculator after entering secondary school, and I began using that for my conversions instead. Today, I rarely bother, as I rarely deal with decimal numbers at all.
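For what it’s worth, the conversions I used to punch into a calculator are one-liners in most languages these days. A quick sketch in Python (just an illustration, not something from back then):

```python
# Parse a hexadecimal string into a decimal integer,
# and render a decimal integer back as hexadecimal and binary.

def to_bases(n):
    """Return a number's hexadecimal and binary string representations."""
    return format(n, "x"), format(n, "b")

print(int("C", 16))    # hexadecimal C is 12 in decimal
print(to_bases(3 + 9)) # ('c', '1100')
```

`int(s, base)` handles parsing for any base from 2 to 36, and `format()` (or `hex()` and `bin()`) goes the other way, which is why I no longer reach for a calculator.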
Anyway, this is just a random thought that I felt like sharing.