People generally count in decimal. However, there are some things in this world that don’t. I used to count only in decimal, but these days I reserve decimal for big numbers and actually count in hexadecimal for small numbers and in binary for really small ones.
For me, it’s an occupational hazard. I know it may sound a little weird, but it’s true. If someone asks me to count stuff, I count in binary for small numbers. If someone asks me what 3+9 is, the first number that pops into my head is ‘C’ (hexadecimal for 12).
I think this is a by-product of designing too much computer stuff. People who work closely with computers soon realise that it is far easier to think in binary, octal or hexadecimal than in decimal, simply because computers do not work in decimal.
When I first started out years ago, I used to do conversions between hexadecimal and decimal on a normal calculator, as I did not have a scientific one. I only got a scientific calculator after starting secondary school, and began using it for my conversions instead. Today, I rarely do such conversions anymore, as I seldom deal with decimal numbers.
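For anyone curious, here is a minimal sketch of what those calculator conversions amount to; Python is just my choice of illustration here (the helper names are mine), and its built-in `int`, `hex` and `bin` do all the work:

```python
# Convert between decimal, hexadecimal and binary, much like a
# scientific calculator's base-conversion mode.

def dec_to_hex(n: int) -> str:
    return hex(n)          # e.g. 12 -> '0xc'

def dec_to_bin(n: int) -> str:
    return bin(n)          # e.g. 12 -> '0b1100'

def hex_to_dec(s: str) -> int:
    return int(s, 16)      # e.g. 'c' -> 12

def bin_to_dec(s: str) -> int:
    return int(s, 2)       # e.g. '1100' -> 12

print(dec_to_hex(3 + 9))   # 0xc -- the 'C' that pops into my head
print(hex_to_dec('ff'))    # 255
print(dec_to_bin(27))      # 0b11011
```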
Anyway, this is just a random thought that I felt like sharing.
You should see my seven-year-old nephew!! I taught him that counting in binary on his fingers gives him access to a much wider range of numbers, and now he’s counting like this all the time.
One problem is that he now likes to flash ‘four’ at his teachers, though… 🙂
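For reference, the jump in range really is dramatic: ordinary counting gets one hand to 5, but treating five fingers as five bits reaches 31, and ten fingers reach 1023. A quick Python sketch (the helper name and bit ordering are just my own conventions):

```python
# Five fingers = five bits: one hand counts 0-31, two hands 0-1023,
# instead of 0-5 and 0-10 the ordinary way.

def fingers(n: int, hands: int = 1) -> str:
    """n as raised (1) / folded (0) fingers, most significant finger first."""
    bits = 5 * hands
    if not 0 <= n < 2 ** bits:
        raise ValueError(f"need 0 <= n < {2 ** bits}")
    return format(n, f"0{bits}b")

print(fingers(4))              # 00100 -- only the middle finger up...
print(fingers(31))             # 11111 -- a full hand
print(fingers(1023, hands=2))  # 1111111111 -- every finger on both hands
```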
Wow! I have a 7-year-old nephew too. I was just wondering the other day when to expose him to binary arithmetic. It might just increase his interest in math. I think I’ll teach him binary soon.
Maybe I’ll teach my nephew to count binary like the Japanese do. Then, he’d have to flash a 27 instead! 🙂
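In case the joke needs unpacking: Japanese-style counting starts from an open hand and folds fingers down, so a raised/folded pattern reads as the bitwise complement of the usual one. Two lines of Python show the arithmetic:

```python
# The rude gesture: only the middle finger raised.
gesture = 0b00100         # reads as 4 when raised fingers are the 1-bits
print(0b11111 ^ gesture)  # 27 -- when folded fingers are the 1-bits instead
```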