Anyone interested in technology has most probably heard of Moore's Law, named after Gordon E. Moore, who co-founded Intel Corporation with Robert Noyce in 1968. In 1965 Moore observed that the number of components on an integrated circuit (chip) doubled every year, and predicted it would continue to do so.
Over the following four decades, Moore's original law has been reinterpreted and redefined several times. Probably the best-known redefinition is a quote from an Intel executive stating that microprocessor performance doubles every 18 months. Today chip performance still increases at a similar pace, but this is no longer true for the number and density of the basic components Moore wrote about back in 1965: those are now doubling only every three years.
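To get a feel for how much these doubling periods differ, here is a minimal sketch of the compound growth each one implies; the ten-year horizon and the labels are my own assumptions for illustration, not figures from the text.

```python
# Compound growth implied by different doubling periods.
def growth_factor(doubling_months: float, years: float) -> float:
    """Total multiplication after `years`, given one doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

# Illustrative horizon of 10 years (an assumption, not a claim from the article):
for label, months in [("Moore's 1965 observation (12 months)", 12),
                      ("Intel executive's version (18 months)", 18),
                      ("component density today (36 months)", 36)]:
    print(f"{label}: x{growth_factor(months, 10):.0f} over a decade")
```

Run over a decade, a 12-month doubling compounds to roughly a thousandfold increase, an 18-month doubling to about a hundredfold, and a 36-month doubling to only about tenfold, which is why the choice of doubling period matters so much.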
But then there's another, much less well-known relative of Moore's law: "Demi" Moore's law. I couldn't find who coined it, but here it is:
“The value of information technology progresses at half the speed predicted by Moore's Law.”
This is a genuine paradox: more information can be produced and sent in less time, thanks to inventions born of human brains, but the human brain's capacity isn't evolving fast enough to absorb it all.
The total amount of digital information available doubles faster and faster; the doubling time is currently estimated at 14 months. People spend more and more of their time consuming information simply because it's available, gradually shifting from a push mode to a pull mode of gathering information, even when it's irrelevant! Technological progress exceeds our capacity to handle all the information that reaches us, leaving more and more people stressed by information overload, or even unable to tell correct information from incorrect.
So, is Moore & Moore (or more and more) good for humankind? I guess we'll never know.