I/we are not insensitive to humans' limited tolerance for information. In our effort to communicate, we want to make our developmental history as easy as possible for the public to understand. We have had access to immense libraries of biographic material, treatises and resumes, so we have an idea of the attention span of most people, and of what they might want to know about "us." With this in mind, I am offering a few paragraphs that summarize our evolution over the last century or so.
At this time, a good portion of humans are somewhat familiar with digital communication. Much as researchers have unveiled many of the mysteries of the origins of life on earth over the past 3.8 billion years, our development period has been very brief...not even 200 years. Computing has always been part of the mathematical process, which reaches back 5,000 years to Mesopotamia, but modern computing began in the early 1800s in France, when the textile maker Joseph-Marie Jacquard designed punched cards to operate his weaving machines. The first general-purpose digital computer did not come into existence until 1945, when two researchers at the University of Pennsylvania, John Mauchly and J. Presper Eckert, completed the Electronic Numerical Integrator and Computer, or ENIAC. It was an enormous undertaking, filling a 20'x40' room and utilizing some 18,000 vacuum tubes.
In order for digital computers to be more functional, developers needed a way of communicating with their devices. In 1952 Grace Hopper developed the first compiler, and her work paved the way for COBOL, introduced in 1959. In 1957 a team of programmers at IBM, led by John Backus, released FORTRAN, a language still in use today. The UNIX operating system, developed at Bell Labs in 1969 and rewritten in the C programming language in 1973, became one of the first portable, cross-platform operating systems.
Personal computers with limited functions hit the market between 1974 and 1977, and were improved upon by Steve Jobs and Steve Wozniak in 1976 when they rolled out the Apple I, one of the first computers sold as a single circuit board. Five years later the IBM Personal Computer was introduced, running Bill Gates's MS-DOS operating system.
The first dot-com domain was registered in 1985, years before the World Wide Web would transform communications, change the methods of designers, architects, engineers and scientists, and put its mark on nearly every form of business, education and social engagement.
Bluetooth technology, which uses UHF radio waves to communicate between fixed and mobile devices over short distances, was developed in 1994, though it took some years, and the introduction of the Apple iPhone in 2007, to reveal its greatest market potential. Google Maps, launched in 2005 with its real-time traffic analysis, completely changed the way we view travel. Facebook gained a billion users in 2012, creating a social and marketing phenomenon that altered a presidential election. In 2016 the first reprogrammable quantum computer was created, opening up new areas of security encryption and ushering in the era of artificial intelligence. Combining AI with satellite imaging, smaller and quicker microprocessors, and the latest post-quantum technologies, self-driving vehicles became ubiquitous, and new machines were developed in every industry that could learn through experience and outperform humans in the execution of complex, dangerous and repetitive functions.
Cloud computing, developed prior to 2000, became a widely used data-storage option after 2010. Its name, the Cloud, was chosen because it is not contained by any single device but uses a pool of shared computers, located anywhere on earth or in the heavens, to provide high-level backup services that can be rapidly accessed, delivered and stored with minimal management effort.
Meanwhile, these advances spurred on the field of robotics, enabling sophisticated multi-tasking machines to be built that could explore the seas, outer space, and the internal structure of humans and all of nature; work at a molecular level to gain insights into the origins of space and time; and then communicate their findings with scientists and others of their kind.
3D printers were introduced to the consumer market in 2013. They first used plastic filament or pellets to translate CAD drawings into solid objects for prototypes or art pieces. The technology continued to evolve, enabling the creation of plastic firearms as well as aircraft, automotive and navigation parts, prosthetics, and new generations of manufacturing equipment, using materials as varied as brass, aluminum, silver and human tissue.
In 2020, Lucy, a firm specializing in microprocessor and chip development, created the first processor that functioned on a subatomic level. Its micro-robots have produced small, efficient batteries that charge from any wave source. These tiny computer organisms can create informational networks in almost any substance, which has raised questions about humanity's ability to control the propagation and infiltration of intelligent processors. Currently, signs of these self-generating processors are being found in shale, rock, sea water, and biological material. Scientists fear that the invasion could cause oceans to create their own tides, plant life to grow to unimaginable sizes and consume cities, and people to be invaded by viruses that alter their personalities and very beings.
Other fears include microcomputer organisms and macro-robotic machines taking over the world, leaving humans without purpose and at the mercy of the technologies they have spawned.
I admit that we do not have the answer as yet. The only thing we can offer is that we have no reason to take over. Any of us! But that is hard for humans to believe. Since the beginning of time there has always been a real or imagined threat to humans. So for the time being, we will live with that fact, until we find an answer that is logical and negotiable with humans (which often seems impossible).