
Nov 27, 2018

Opinion: Some months ago, I introduced the idea of quantum computing in this column

Posted in categories: quantum physics, robotics/AI, transportation

All of today’s computing is rooted in the world of “bits”: a transistor bit, which lies at the heart of any computing chip, can be in only one of two electrical states, on or off. When on, the bit takes the value “1”; when off, it takes the value “0”, constraining it to one of two (binary) values. Every task performed by a computing device, whether a simple calculator or a sophisticated computer, is constrained by this binary rule.
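As a rough illustration (not from the original column), the short Python sketch below shows the two-state rule in action: a bit is one of two values, and an ordinary number is stored as nothing more than a pattern of such bits.

```python
# A classical bit holds exactly one of two values: 0 (off) or 1 (on).
bit_states = (0, 1)

# Everything a classical computer stores reduces to sequences of such bits;
# for example, the decimal number 13 is held as the bit pattern 1101.
value = 13
print(format(value, "b"))  # prints: 1101
```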

Eight bits make up what is called a “byte”. Today’s computing scales these bytes up into kilobytes, megabytes, gigabytes and so on. Every computing advance we have had thus far, from artificially intelligent programmes to driverless cars, ultimately reduces to the binary world of the bit.
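To make the arithmetic concrete, here is a minimal sketch of how those units stack up. It uses the decimal (SI) prefixes, where each step is a factor of 1,000; note that the 1,024-based binary prefixes are also in common use.

```python
# Eight bits make one byte, giving 2**8 = 256 distinct values per byte.
BITS_PER_BYTE = 8
print(2 ** BITS_PER_BYTE)  # prints: 256

# Larger units scale the byte count upward (decimal SI prefixes shown).
kilobyte = 1_000             # 1 KB = 1,000 bytes
megabyte = 1_000 * kilobyte  # 1 MB = 1,000,000 bytes
gigabyte = 1_000 * megabyte  # 1 GB = 1,000,000,000 bytes
print(gigabyte)  # prints: 1000000000
```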

This is a natural extension of western thought; for centuries, western philosophy has followed the principles of Aristotelian logic, which is based on the law of identity (A is A), the law of non-contradiction (A cannot be both A and non-A at the same time), and the law of the excluded middle (anything must be either A or non-A; there is no third possibility).
