What Will Happen Now That Computers Have Connected Us All, For Better Or Worse, Is Unclear

This essay was written, edited, and produced entirely on an internet-connected laptop computer. Just a few decades ago, such a foldable, portable device would have taken computing professionals by surprise; before then, it would have seemed nothing short of a miracle. Processors containing hundreds of millions of tiny switching elements power such machines, executing millions of lines of software code written by people all over the globe. Click, tap, type, or speak, and the result appears on the screen almost instantly.

Computers were once so large that they filled entire rooms. Now they are everywhere and nearly invisible, embedded in everything from watches to car engines to cameras to televisions to toys, and they keep growing more sophisticated. They oversee electrical distribution networks, analyze scientific data, and predict the weather. Modern society would be hard to imagine without them. As researchers strive to make computers faster and programs more intelligent, they are also working to ensure that the technology is used responsibly. Their efforts build on more than a century of technical progress.

A programmable computer designed by English scientist Charles Babbage in 1833 was a precursor of today's computing architecture. Anticipating the layout of modern computers, its components included a "store" for holding numbers, a "mill" for operating on them, an instruction reader, and a printer. This Analytical Engine also supported logical techniques such as branching (if X, then Y) and looping. Although Babbage built only a portion of the machine, his collaborator Ada Lovelace recognized from his description that the numbers it manipulated could represent anything, including music. The machine, she wrote, might someday act upon "a new, a vast, and a powerful language" for analysis.

Lovelace understood the workings of the designed machine so thoroughly that she is often called the first programmer.
In 1936, British mathematician Alan Turing proposed the concept of a computer that could rewrite its own instructions, making it endlessly reprogrammable. Using a limited set of operations, such a machine could simulate any other computer, however complex; it became known as the universal Turing machine.
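Turing's idea can be made concrete in a few lines of code. The sketch below is purely illustrative (the names `run_turing_machine` and `flip_rules` are ours, not from any historical source): a machine's entire behavior is captured by a small table of rules mapping a (state, symbol) pair to a new symbol, a head movement, and a new state. Changing the table reprograms the machine without changing the simulator.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine; return the final tape contents."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[pos] = new_symbol
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Rule table for a tiny example machine: scan right, flipping
# every 0 to 1 and every 1 to 0, and halt at the first blank cell.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip_rules, "10110"))  # prints 01001
```

The simulator itself never changes; swapping in a different rule table yields a different machine, which is the essence of Turing's insight about programmability.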

Colossus, the world's first reliable electronic digital computer, was built in 1943 to help the United Kingdom decipher wartime codes. Instead of moving mechanical parts such as the Analytical Engine's cogwheels, it used vacuum tubes, devices that direct the flow of electrons. That made Colossus very fast, but engineers had to physically rewire it each time they wanted it to perform a different task.
Inspired by Turing's vision of a more readily reprogrammable computer, the team that built the United States' first electronic digital computer, ENIAC, drew up a new design for its successor, EDVAC, which stored its program in memory alongside its data. The mathematician John von Neumann wrote the 1945 report describing the design, which is why this stored-program layout is still known as the von Neumann architecture.

In 1947, researchers at Bell Telephone Laboratories invented the transistor, a piece of circuitry in which the application of voltage (electrical pressure) or current controls the flow of electrons between two points. It came to replace the slower and less-efficient vacuum tubes.

In 1958 and 1959, researchers at Texas Instruments and Fairchild Semiconductor independently invented integrated circuits, in which transistors and their supporting circuitry were fabricated on a chip in one process.

For a long time, only experts could program computers. Then, in 1957, IBM released FORTRAN, a programming language that was far easier to understand; it is still in use today. In 1981, the company unveiled the IBM PC, and Microsoft released its MS-DOS operating system, together expanding the reach of computers into homes and offices. Apple personalized computing further with the operating systems for its Lisa in 1983 and Macintosh in 1984, which popularized graphical user interfaces, or GUIs, offering users a mouse cursor instead of a command line.
