Saturday 20 May 2017

Computers, Elevators & the Miniaturization of the Thinking Machine

EXPLORING THE DIGITAL WORLD -----



This series of illustrated lectures was developed for the information of the Senior residents of the New Horizon Tower at Bloor and Dufferin St. in Toronto, Canada.


Please note that most of the illustrations, and quite a lot of the information, in this series come from the Internet, more specifically from Wikipedia, and have been edited for this presentation.


My thanks to Lindsay O'Brian and Patrizia Palumbo, who made the PowerPoint programs that are incorporated here.


Bill Coffman
"Spitfire Studio"






DEDICATED TO TWO OF MY GREAT GRANDCHILDREN, CONNOR AND STELLA,
WHO WILL ALWAYS LIVE IN THIS DIGITAL WORLD!








An ongoing topic in the news media these days is the Autonomous Automobile: the car that drives itself with the aid of computers, or digital technology. (They never talk about these empty cars driving round and round with nobody in them, looking for a place to park.)



Every time we get on an elevator, we are using an “Autonomous” device operated with digital technology.



The elevator is not a complicated device. Do you remember those old elevators in Eatons and Simpsons and in the fancy hotels? The ones where the operators, ladies or gentlemen with white gloves, worked the levers that opened and closed the folding steel gates? With another lever, they controlled the gentle ascent and descent of the ‘cage’ and the soft stop at floor level?

The operation of elevators is essentially the same today, except that when you push the button for the desired floor a digital device is activated. The door opens and closes automatically, with a delay if a digital sensor detects that there is still a solid object in the path of the closing door. Then the computer follows your selection to control the gentle ascent or descent and the soft stop at floor level.

Our digital elevators at NHT even announce the direction of travel and tell you that they have arrived at a floor. (Just like the elevator operator in Eatons.) The elevator was an early example of digital age equipment. (And yes, there are ‘analog’ safety devices there to protect you in the event of electrical failure or other emergency.)
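For the technically curious, here is a minimal sketch, in the Python programming language, of the kind of logic an elevator's computer follows. All of the names are invented for illustration; a real controller is far more elaborate (and backed by those analog safety devices).

    # A minimal sketch of elevator logic. Illustration only.
    def run_elevator(current_floor, requested_floor, door_blocked):
        while door_blocked():              # the digital door sensor
            print("Holding door open...")
        print("Doors closing.")
        direction = "up" if requested_floor > current_floor else "down"
        print("Going " + direction + ".")
        while current_floor != requested_floor:
            current_floor += 1 if requested_floor > current_floor else -1
        print("Floor " + str(current_floor) + ". Doors opening.")

    # Example: the car is at floor 3 and a passenger presses 7.
    run_elevator(3, 7, door_blocked=lambda: False)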

It is perhaps a little unfair to compare the elevator, which only travels in two directions on a tightly controlled route, to the Autonomous Automobile, but I am using the elevator as a familiar example of a mechanical device that is operated with digital technology.

The so-called “Digital World” refers to digital technology, and not just computers. A hearing aid is not exactly an example of ‘high technology’, but it does use transistors, the basic building blocks of digital electronics.

Which brings up our definition for today:

To understand this digital thing, we have to understand the difference between ‘Analog’, and ‘Digital’. 

We are still living in an analog world.

 
As humans, we perceive the world essentially in analog. Everything we see and hear is perceived as a continuous transmission of information to our senses. Shapes, colours, light, sounds, smells, and feel. This continuous stream of information can be defined as analog data.

Before the digital age, sight and sound were recorded, stored, and transmitted as waves of analog information by mechanical or electrical means, on analog devices. These devices would then translate and display the stored information back to our senses of sight and sound using other electric or mechanical means. Radio is an analog device, as were the telephone, the typewriter, the camera, and other familiar equipment of the 20th century.

A watch or clock with hands is usually Analog, while a watch or clock displaying the time in numbers is usually Digital.





As another example, a record player with a turntable is an analog device, while a CD player is digital. 



Music exists in analog form as sound waves or vibrations. On a record, these waves have been mechanically converted into bumps and ridges in the spiral grooves on the surface of the record. An analog turntable reads the bumps in the grooves using a ‘needle’ that mechanically creates a continuous electrical wave signal, which is converted back to analog sound vibrations by electromagnetic ‘speakers’.



  
On a CD, the sound waves have been translated into a digital format that has been encoded onto the disc. A CD player reads imprinted binary digital data that has been stored on the surface of the disc. Using a tiny laser and a processor, it digitally translates the encoded stream back into analog sound waves through electromagnetic ‘speakers’. 
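As a rough illustration of the idea (and not how a real CD player is built), here is a small sketch in the Python programming language: the continuous wave is measured, or 'sampled', at regular moments, and each measurement is stored as a number. A real CD stores 44,100 such numbers, in binary form, for every second of sound.

    import math

    # Toy analog-to-digital conversion: measure a sound wave at regular
    # intervals and store each measurement as a number.
    def sample_wave(frequency_hz, num_samples, sample_rate):
        return [math.sin(2 * math.pi * frequency_hz * n / sample_rate)
                for n in range(num_samples)]

    samples = sample_wave(frequency_hz=1.0, num_samples=8, sample_rate=8)
    print([round(s, 2) for s in samples])
    # [0.0, 0.71, 1.0, 0.71, 0.0, -0.71, -1.0, -0.71]
    # The player turns numbers like these back into a smooth electrical
    # wave for the speakers.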




A Computer is a device that can be instructed to carry out a set of operations following a sequence of instructions called a program, which is stored as part of the digital “memory” in the device.

Computers are used as control systems for a very wide variety of industrial and consumer devices. These include simple devices like microwave ovens and other kitchen appliances, and industrial devices such as robots and computer-assisted machining. Personal computers, whether ‘Desktop’, ‘Laptop’, ‘Tablet’, or ‘Smart Phone’, use digital technology to input, store, transmit, and reproduce information. The Internet is run on computers, and it connects millions of other computers.

The real, analog world exists on both the input and the display sides of the personal computer. The digital world exists in between…





Let’s take a look at the computer from an imaginary viewpoint: anthropomorphization.
Introducing The “Digital Dog”








Think of the Computer as a Dog you have bought that comes already trained to do all sorts of useful things. You just have to learn the commands that tell him what to do.

He will get the Mail, - open it, and display it for you.
He will fetch the Newspaper, - Open it up page by page, or story by story, sports section or entertainment section. (He also does the same with magazines.)
He will find things in the library or the encyclopedia for you. All kinds of information.
He will even get the weather report. (And he can translate the metric temperature into the Fahrenheit that you can understand; see the small example after this list.)
He is a Smart Dog. - He can’t type, but if you type, he’ll show you what you are typing, print it out for you, or mail it to one person or all of your friends and relatives. While you are typing, he’ll correct your spelling, and sometimes your grammar. He will also file your message away so that he can find it again for you.
He is a Very Smart Dog - If you take photos, he will develop them, adjust them, enhance them, and even add them to messages. And of course he will file them away in his own little storage system.
He is a Very, Very Smart Dog - He can fetch and play music. All kinds of music. And store it in his files so you can play it again. 
He can also get and play videos, and episodes from the TV and Movies too. You have to tell him what you want to watch. (He may have suggestions for you.)
And he is a Guide Dog who will guide you around town or around the world, and can find all kinds of places with things to eat, see, or do. He can even show you pictures of where you are going.
There are special breeds of this new Smart Dog, tiny breeds, who can even fetch and send telephone messages.
This dog comes well trained, but you can give him added instructions that will have him do other, special things for you.
You don’t have to feed him, but you will still have to pay Ma Bell, Mr Rogers, or some other ‘Provider’ for his upkeep. You will never have to take him for ‘Walkies’.
You just have to learn the commands that will get him to do these things for you. And he will show you every one of them.
He won’t wag his tail, but you will never have to worry about him shedding.
If you can think of it as a “Digital Dog”, that’s your computer…
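About that metric temperature the Digital Dog translates: the conversion is simple arithmetic, Fahrenheit = Celsius × 9 ÷ 5 + 32. Here it is as a tiny sketch in the Python programming language:

    def celsius_to_fahrenheit(celsius):
        return celsius * 9 / 5 + 32

    print(celsius_to_fahrenheit(20))   # 68.0, a comfortable room temperature
    print(celsius_to_fahrenheit(-40))  # -40.0, the one point where the scales agree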
+++++++++++++++++++++++++++++++++++++++++++++++++++++
HISTORY
The history of computers covers the development from simple early calculating devices to the modern day computer.



Calculators

Before the 20th century, most calculations were done by humans, often by counting on fingers.



Devices have been used to aid computation for thousands of years. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the developing civilizations included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or containers of grains.





Early mechanical tools to help humans with calculations or arithmetical tasks, such as the abacus, were called "calculating machines", or calculators. The abacus was developed from devices used in Sumeria and Babylon as early as 2400 BC, and was used by the early Egyptians, Persians, Greeks, and Romans. The Chinese abacus, dating from around 200 BC, used a different counting system.


On the Chinese abacus there are two beads on each rod in the upper deck and five beads on each rod in the bottom deck, allowing both decimal and hexadecimal (base 16) calculation. The beads are counted by moving them up or down towards the beam: if you move them toward the beam, you count their value; if you move them away, you don't count their value.
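In other words, each rod holds a single digit, and the digit's value is simple arithmetic on the beads at the beam. A small sketch in the Python programming language (the names are invented for illustration):

    # Each rod encodes one digit: an upper-deck bead counts 5 when moved
    # to the beam, a lower-deck bead counts 1.
    def rod_value(upper_beads_at_beam, lower_beads_at_beam):
        return upper_beads_at_beam * 5 + lower_beads_at_beam

    print(rod_value(1, 3))   # 8  (one upper bead and three lower beads)
    print(rod_value(2, 5))   # 15 (every bead at the beam; this extra
                             #     reach is what allows base 16 work)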


(When Mr Hontani, a salesman from Shimano in Japan, first visited me at CCM with their bicycle products around 1960, he used a small abacus to calculate his prices. On his next visit he used a small hand-held Sanyo calculator…)





Many of the early calculating instruments were astronomical and navigational devices used by sailors. The Antikythera mechanism is believed to be the earliest mechanical analog “calculator”. Designed to calculate astronomical positions, it was discovered in 1901 in a shipwreck off the Greek island of Antikythera and has been dated to around 100 BC. Devices of a comparable level of complexity would not reappear until a thousand years later.



The Astrolabe was invented in the Hellenic world in the 1st or 2nd century BC. The astrolabe was effectively an analog calculator capable of working out several different kinds of problems in astronomy. An astrolabe incorporating a mechanical calendar was invented in Persia in 1235.
During the middle ages, many forms of reckoning boards or tables were invented to facilitate commerce. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, to calculate sums of money.

The Sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions such as square and cube roots, was developed in the late 16th century and found application in gunnery, surveying, and navigation.
The Slide Rule was invented around 1620–1630, shortly after the publication of the concept of the logarithm. It is a hand-operated analog calculator for doing multiplication and division. As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as logarithms, exponentials, and trigonometric functions.
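The slide rule's trick is that adding lengths on logarithmic scales multiplies numbers, because log(a × b) = log(a) + log(b). A quick check of that identity in the Python programming language:

    import math

    # The slide rule multiplies by adding logarithms.
    a, b = 3.0, 4.0
    product = math.exp(math.log(a) + math.log(b))
    print(product)  # 12.000000000000002, i.e. 3 x 4 within rounding error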



In the 1770s the Swiss watchmaker Pierre Jaquet-Droz built a mechanical doll that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire in Neuchâtel, Switzerland, and still operates.





Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. In 1801, Joseph Marie Jacquard developed a loom in which the pattern being woven was controlled by a chain of punched cards. The cards could be changed without changing the mechanical design of the loom. This was a landmark achievement in programmability, and his machine was an improvement over similar weaving looms.
Punched paper bands were also used in ‘player’ pianos.

The first mechanical calculators required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. Numbers could also be manipulated automatically by mechanisms such as those in an adding machine or cash register. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. More sophisticated electrical machines did specialized mechanical (analog) calculations in the early 20th century.




By 1920, electromechanical tabulating machines could add, subtract and print accumulated totals. When the United States instituted Social Security in 1935, IBM punched card systems were used to process records of 26 million workers. Punched cards became standard in industry and government for accounting and administration.








Punched card techniques became sufficiently advanced to solve some complex mathematical calculations with electric accounting machines or tabulating machines. Such machines were used during World War II for cryptographic statistical processing, as well as a vast number of administrative uses. 


The first digital electronic calculating machines were developed and built in Britain during World War II to break coded Nazi military communications.
In the late 20th century, a series of breakthroughs, such as miniaturized calculators and integrated circuits, caused digital calculators to largely replace analog calculators. The cost of computers gradually became so low that by the 1990s, personal computers, and then, in the 2000s, mobile computers, replaced calculators in general use.

First computing device
Charles Babbage designed the first mechanical computer in the early 19th century. The input of programs and data was to be provided to the machine by punched cards, a method being used at the time to direct mechanical looms. For output, the machine would have a printer, a curve plotter, and a bell. The machine would also be able to punch numbers onto cards to be read in later. Babbage's Analytical Engine was the first design for a general-purpose computer that met modern standards.

The machine was unfortunately about 100 years ahead of its time. The project was eventually dissolved when the British Government decided to cease funding.








Analog computers
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

By the 1950s the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications, such as education (the slide rule) and aircraft (control systems).





By 1938 the US Navy had developed an electromechanical analog computer small enough to use aboard a submarine. This was the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II similar devices were developed in other countries as well.
Early digital computers were electromechanical. Electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers that used vacuum tubes. 

The German Z2, built in 1939, was one of the earliest examples of an electromechanical relay computer.





In 1941, this was followed by the Z3, the world's first working electromechanical, programmable, fully automatic digital computer. This machine was built with 2,000 relays. Program code was supplied on punched film, while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects. Using a binary coding system meant that the German machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was used for aerodynamic calculations but was destroyed in a bombing raid on Berlin in late 1943.
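Binary simply means that every number is written using only two symbols, 0 and 1, a natural fit for switches and relays that are either off or on. The Python programming language can show the binary form of a few numbers directly:

    # Binary: every number written with only 0s and 1s.
    for n in [1, 2, 5, 12]:
        print(n, "=", bin(n))
    # 1 = 0b1
    # 2 = 0b10
    # 5 = 0b101
    # 12 = 0b1100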


The British Post Office, in 1939, converted a portion of the telephone exchange network into an electronic data processing system using thousands of vacuum tubes. In the US, the first "automatic electronic digital computer" was developed in 1942. This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.

(Vacuum tubes, or tubes - the English called them ‘valves’ - are the same kind that we had in our old radios and early TVs. Their function was to modulate an electromagnetic wave: that is, to change the amplitude or frequency of a wave in accordance with the variations of a second signal, much as we modulate the strength, tone, or pitch of our voices. Vacuum tubes carried elements called cathodes and anodes that heated up like a light bulb as part of their function. Early computing devices required a lot of cooling! Vacuum tubes were replaced by transistors and integrated circuits in the digital world.)
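Modulation can even be written as a small formula. In amplitude modulation, the signal (the voice or music) continuously varies the strength of a fast 'carrier' wave. A toy sketch in the Python programming language, purely for illustration:

    import math

    # Toy amplitude modulation: a slow signal wave varies the strength
    # (amplitude) of a fast carrier wave.
    def am_sample(t, carrier_hz, signal_hz):
        signal = math.sin(2 * math.pi * signal_hz * t)    # the voice or music
        carrier = math.sin(2 * math.pi * carrier_hz * t)  # the radio wave
        return (1 + 0.5 * signal) * carrier               # the modulated wave

    print(round(am_sample(t=0.00025, carrier_hz=1000.0, signal_hz=10.0), 3))  # 1.008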



Digital Computers
Bletchley Park was the central site for British codebreaking efforts during World War II. Run by the Government Code and Cypher School, it regularly penetrated the secret communications of the Axis powers. (The official historian of World War II British Intelligence has written that the “Ultra” intelligence produced at Bletchley shortened the war by two to four years.) 






The British achieved a number of successes at deciphering coded German military communications. The German Enigma codes were first decoded in 1941 with the help of an electromechanical computing device called the “Bombe”, which could find the settings of the Enigma machine.


You may have seen The Imitation Game, a 2014 American historical drama movie loosely based on the biography of Alan Turing. It stars Benedict Cumberbatch as the real-life British cryptanalyst, who decrypted German intelligence codes for the British government during World War II. The movie included a replica of the Bombe device.


To crack the more sophisticated Nazi Lorenz codes, used for high-level Army communications, the British team spent eleven months from early February 1943 designing and building the first “Colossus”, the world's first electronic digital programmable computer. It used a large number of vacuum tubes, had punched paper-tape input, and was capable of being configured to perform a variety of possible operations on its data. Colossus Mark I contained 1,500 vacuum tubes; Mark II, with 2,400 tubes, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process.

ENIAC (Electronic Numerical Integrator and Computer), completed in the US at the end of 1945 and unveiled in 1946, was the first American electronic programmable computer. Although ENIAC was similar to the Colossus, it was over 1,000 times faster and more flexible. It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5,000 times a second, and also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words (about 80 bytes). ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. Computers have since been made smaller.

The stored-program computer keeps a set of instructions in its memory as a program that details the calculation process. The world's first stored-program computer was built at the Victoria University of Manchester in Britain, and ran its first program in 1948. Although the computer was considered "small and primitive" by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer. A project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1, which quickly became the prototype for the Ferranti Mark 1 of 1951, the world's first commercially available general-purpose computer. The second and only other Ferranti Mark 1 was sold at a major loss to the University of Toronto, where it was re-christened FERUT.
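To give a feel for what 'stored program' means, here is a toy machine sketched in the Python programming language. Both the instructions and the data sit in the machine's memory, and the machine simply fetches and obeys one instruction after another. (An invented illustration, not the Manchester machine's actual instruction set.)

    # A toy stored-program machine: the program lives in memory, and the
    # machine fetches, decodes, and executes one instruction at a time.
    program = [
        ("LOAD", 7),      # put the number 7 in the accumulator
        ("ADD", 5),       # add 5 to it
        ("PRINT", None),  # show the result
        ("HALT", None),   # stop
    ]

    def run(memory):
        accumulator = 0
        counter = 0                             # address of the next instruction
        while True:
            operation, value = memory[counter]  # fetch
            counter += 1
            if operation == "LOAD":             # decode and execute
                accumulator = value
            elif operation == "ADD":
                accumulator += value
            elif operation == "PRINT":
                print(accumulator)              # prints 12
            elif operation == "HALT":
                break

    run(program)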


Transistors
The transistor was invented in 1947 at the Bell Telephone Laboratories. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, almost indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.



Integrated circuits
The next great advance in computing power came with the advent of the integrated circuit chip in 1958. An integrated circuit, or IC, is a small chip that can function as an amplifier, oscillator, timer, microprocessor, or even computer memory. An IC is a small wafer, usually made of silicon, that can hold anywhere from hundreds to millions of transistors, resistors, and capacitors. These extremely small electronic devices can perform calculations and store data using either digital or analog technology.

This new development heralded an explosion in the commercial and personal use of computers and led to the invention of the microprocessor.


Networking and the Internet
Computers have been used to coordinate information between multiple locations since the 1950s. The U.S. military's SAGE system, in 1958, was the first large-scale example: the first large-scale computer communications network, it connected 23 hardened computer sites in the US and Canada. Its task was to detect incoming Soviet bombers and direct interceptor aircraft to destroy them. The air defence system used two computers, each of which used a full megawatt of power to drive its 55,000 vacuum tubes, 175,000 diodes, and 13,000 transistors.

In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. In time, the network spread beyond academic and military institutions and became known as the Internet. The emergence of networking involved a redefinition of the nature and boundaries of the computer. 

Computer operating systems and applications were modified to include the ability to access the resources of other computers on the network, as extensions of the resources of an individual computer. Initially these facilities were available primarily to people working in high-tech environments, but in the 1990s the spread of applications like e-mail and the World Wide Web, combined with the development of cheap, fast networking technologies saw computer networking become commonplace. A very large proportion of personal computers regularly connect to the Internet to communicate and receive information. "Wireless" networking, using mobile phone networks, has meant networking is becoming increasingly widespread.







The Univac 1 was the first commercial computer to attract widespread public attention. Manufactured by Remington Rand beginning in 1951, Univac computers were used in many different applications, but utilities, insurance companies, and the US military were major customers. One biblical scholar even used a Univac 1 to compile a concordance to the King James version of the Bible. The Univac 1 used 5,200 vacuum tubes and weighed 29,000 pounds. Remington Rand eventually sold 46 Univac 1s at more than $1 million each.


In 1959, the IBM 1401 mainframe, the first in the series, replaced earlier vacuum tube technology with smaller, more reliable transistors. Demand called for more than 12,000 of the 1401 computers, and the machine's success made a strong case for using general-purpose computers rather than specialized systems. By the mid-1960s, nearly half of all computers in the world were IBM 1401s.





In 1964, the Canadian Chalk River Nuclear Lab needed a special device to monitor a reactor. Instead of designing a custom controller, two young engineers from Digital Equipment Corporation (DEC), Gordon Bell and Edson de Castro, developed a small, general-purpose computer and programmed it to do the job. A later version of that machine became the PDP-8, the first commercially successful minicomputer. It sold for $18,000, one-fifth the price of a small IBM System/360 mainframe. Because of its speed, small size, and reasonable cost, the PDP-8 was sold by the thousands to manufacturing plants, small businesses, and scientific laboratories around the world.









Also in 1964, the announcement of the IBM System/360 was a major event in the history of computing. On April 7, IBM announced five models of System/360, aimed at both business and scientific customers; all models could run the same software, largely without modification. IBM's initial investment of $5 billion was quickly returned as orders for the system climbed to 1,000 per month within two years. At the time IBM released the System/360, the company had just made the transition from discrete transistors to integrated circuits, and its major source of revenue began to move from punched card equipment to electronic computer systems.
____________________________
Ken Iverson: At this point it is appropriate to remember a Canadian who was a major contributor to the Digital World, and who had close ties to a resident of this Tower.

Ken Iverson, the late husband of Jean Iverson, was the first in the world to prepare and teach a course in the new science of Business Data Processing, at Harvard University in the late 1950s. In 1965 he went to work at IBM on the development of the software programming for the IBM 360 called APL 360.

This discourse is mostly about the hardware, the devices that produced the Digital Age. Ken worked on the software side: the programs and operating systems underlying all that is digital. (I don't pretend to understand it.) His mathematical interpretations played a major role in the early development of these programs.

Ken Iverson received the Turing Award in 1979 for his work. He passed away in 2004.

________________________
_________________________________________________________________________


The Olivetti Programma 101 went on sale in 1965. This printing programmable calculator could do addition, subtraction, multiplication, and division, as well as calculate square roots. 40,000 were sold, including 10 to NASA for use on the Apollo space project.



In 1968, the Data General company introduced the Nova minicomputer. It had 32 KB of memory and sold for $8,000. The Nova line of computers continued through the 1970s, and influenced later systems like the Xerox Alto and Apple 1.


The Xerox Alto in 1974 was a groundbreaking computer with wide influence on the computer industry. It was based on a graphical user interface using windows, icons, and a mouse, and worked together with other Altos over a local area network. It could also share files and print out documents on an advanced Xerox laser printer.


Xerox physicist Gary Starkweather realized in 1967 that exposing a copy machine’s light-sensitive drum to a paper original isn’t the only way to create an image. A computer could “write” it with a laser instead. Xerox wasn’t interested. So in 1971, Starkweather transferred to Xerox Palo Alto Research Centre (PARC), away from corporate oversight. Within a year, he had built the world’s first laser printer, launching a new era in computer printing, generating billions of dollars in revenue for Xerox. The laser printer was used with PARC’s Alto computer, and was commercialized as the Xerox 9700.


Designed by Steve Wozniak in 1976, and marketed by his friend Steve Jobs, the Apple-1 was a single-board computer for hobbyists. With an order for 50 assembled systems from a Mountain View, California computer store in hand, the pair started a new company, naming it Apple Computer, Inc. In all, about 200 of the boards were sold before Apple announced the follow-on Apple II a year later as a ready-to-use computer for consumers.








Sold complete with a main logic board, switching power supply, keyboard, case, manual, game paddles, and a cassette tape containing the game Breakout, the Apple II found popularity far beyond the hobbyists who had made up Apple's user community until then. When connected to a colour television set, the Apple II produced brilliant colour graphics for the time. Millions of Apple IIs were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to schools, giving a new generation their first access to personal computers.


The first of several personal computers released in 1977, the Commodore PET came fully assembled with either 4 or 8 KB of memory, a built-in cassette tape drive, and a membrane keyboard. The PET was popular with schools and for use as a home computer. After the success of the PET, Commodore remained a major player in the personal computer market into the 1990s.





Also in 1977, Tandy Radio Shack's first desktop computer, the TRS-80, sold 10,000 units in the first month after its release. The TRS-80 was priced at $599.95 and included easy-to-understand manuals that assumed no prior knowledge on the part of the user. It proved popular with schools, as well as for home use. The TRS-80 line of computers later included colour, portable, and handheld versions before being discontinued in the early 1990s.





In 1981, IBM's brand recognition, along with a massive marketing campaign, ignited the fast growth of the personal computer market with the announcement of its own personal computer (PC). The first IBM PC, formally known as the IBM Model 5150, revolutionized business computing by becoming the first PC to gain widespread adoption by industry. The IBM PC was widely copied (“cloned”) and led to the creation of a vast “ecosystem” of software, peripherals, and other commodities for use with the platform.





Apple introduced the Macintosh with a television commercial during the 1984 Super Bowl. The Macintosh was the first successful mouse-driven computer. Its price was $2,500. Applications that came as part of the package included MacPaint, which made use of the mouse, and MacWrite, which demonstrated WYSIWYG (What You See Is What You Get) word processing.



IBM introduced the PS/2 in 1987, and shipped more than 1 million units by the end of the first year. IBM released a new operating system, OS/2, at the same time, allowing the use of a mouse with IBM PCs for the first time. Many credit the PS/2 for making the 3.5-inch floppy disk drive standard for IBM computers. The system was IBM's response to losing control of the PC market with the rise of widespread copying of the original IBM PC design by “clone” makers.





In 1991 Apple completely redesigned its line of portable computers as PowerBooks. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. The PowerBook 100 was the entry-level machine, while the PowerBook 140 was more powerful and had a larger memory. The PowerBook 170 was the high-end model, featuring an active matrix display, faster processor, as well as a floating point unit. The PowerBook line of computers was discontinued in 2006.

In 1995, the IBM ThinkPad 701 laptop computer, with its keyboard officially known as the TrackWrite, was introduced. The keyboard comprised three roughly triangular interlocking pieces, which formed a full-sized keyboard when the laptop was opened, resulting in a keyboard significantly wider than the case. This keyboard design was dubbed "the Butterfly." The need for such a design was lessened as laptop screens grew wider.



Apple in 1998 introduced its iMac machines, with a translucent coloured housing integrating the computer and the CRT (Cathode Ray Tube) monitor, which sold for about $1,300. The machine was noted for its ease of use and included a 'manual' that contained only a few pictures and fewer than 20 words.






BlackBerry is a line of smartphones and services designed and marketed by the Canadian company BlackBerry Ltd. (formerly known as Research In Motion Limited).
In 2003, the more commonly known BlackBerry smartphone was released, supporting email, mobile telephone, text messaging, and other wireless information services.
BlackBerry was considered one of the most prominent smartphone vendors in the world, specializing in secure communications and mobile productivity. At its peak in September 2013, there were 85 million BlackBerry subscribers worldwide. However, BlackBerry has since lost its dominant position in the market due to the success of the Android and Apple iOS platforms; the number had fallen to 23 million by March 2016. On September 28, 2016, BlackBerry announced it would cease designing its own phones in favour of outsourcing to partners.

In 2007 Apple launched the iPhone - a combination of web browser, music player and cell phone with touchscreen operation - which could download new functions in the form of "apps" (applications) from the online Apple store. The iPhone and its successors also offered built-in GPS navigation, a high-definition camera, texting, a calendar, voice dictation, and weather reports. Many other manufacturers quickly followed with their own hand-held computer-telephones.





The Apple iPad in 2010, combined many of the popular capabilities of the iPhone, such as built-in high-definition camera, access to the iTunes Store, and audio-video capabilities, but with a nine-inch screen and without the phone. Apps, games, and accessories helped spur the popularity of the iPad and led to its adoption in thousands of different applications from movie making, creating art, making music, inventory control and point-of-sale systems, to name but a few.

Other manufacturers have produced competitive devices.




Mobile computers became dominant in the 21st century.
With the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments that spurred the growth of laptop computers and other portable computers allowed manufacturers to integrate computing resources into cellular phones. These smartphones and tablets run on a variety of operating systems that are not necessarily the same as those used in laptop and desktop computers. Smartphones and tablets have become the dominant computing devices on the market.






So that’s where the computer and the smartphone came from. I will be covering the smartphone in more detail in a presentation later in the series.


