Tuesday, April 29, 2014

DID YOU KNOW HIM?


1963: Douglas Engelbart Invents the Mouse

Doug Engelbart demonstrates the future of computing in this videoconference screen-grab from his historic 1968 presentation. Photo courtesy Bootstrap Alliance
Each time you click your mouse, you're paying homage to a UC Berkeley College of Engineering alumnus. Douglas Carl Engelbart, who received his Ph.D. in electrical engineering in 1955, not only invented the mouse but also helped define the way in which we interact with personal computers to this day—from multiple windows to hypertext links.

Born in 1925 on a farmstead near Portland, Oregon, Engelbart studied electrical engineering at Oregon State University and served in the Navy as a radar technician before completing his degree. While working at what is now NASA Ames, Engelbart experienced an epiphany while driving to work.

According to his website, Engelbart envisioned "people sitting in front of cathode-ray-tube displays, 'flying around' in an information space where they could formulate and portray their concepts in ways that could better harness sensory, perceptual and cognitive capabilities heretofore gone untapped. Then they would communicate and communally organize their ideas with incredible speed and flexibility."

Engelbart brought his vision to UC Berkeley, where one of the first general-purpose digital computers, the CALDIC, was under development. After completing his Ph.D. and a brief stint as an assistant professor in the College of Engineering, Engelbart took a position at the Stanford Research Institute. It was there that he wrote his seminal 1962 paper "Augmenting Human Intellect: A Conceptual Framework."

The first mouse was carved from wood and tracked motion via two wheels mounted on the bottom instead of the ball employed in today's models. Photo courtesy Bootstrap Alliance

At his Augmentation Research Center, funded by the Department of Defense's Advanced Research Projects Agency (now DARPA), Engelbart and his talented team of researchers developed the technologies necessary to realize their leader's vision. One of the myriad research projects was an evaluation of the various available "screen selection" devices, light pens and their ilk, that would dovetail with new forms of networked computer interaction. One approach Engelbart tossed into the mix was an idea for a device he had batted around for several years. His trusted collaborator Bill English built the first model out of wood, and the team collectively began calling it the "mouse" because of its resemblance to a rodent. As the experiments continued, the mouse trounced the other devices in terms of usability. (A knee-operated device didn't make the grade.)

In December 1968, at the Fall Joint Computer Conference in San Francisco, Engelbart and his colleagues borrowed an early video projector and operated their On-Line System (NLS), the first collaborative and integrated digital environment, via a homebrew modem and experimental videoconference links to demonstrate their "augmentation framework." The team's "mother of all demos" received a standing ovation and introduced the world to the future of computing. The following year, Engelbart's research lab became the second node on the ARPANET, the predecessor to the Internet, which enabled the group to develop NLS further.

Today, Engelbart lives in Silicon Valley, where he directs a technology think tank called the Bootstrap Alliance, dedicated to the core idea that informed, and continues to permeate, all of his work:


"Purposefully investing in improving organizational collective IQ through intelligent improvement strategies promises to yield compound returns. In simple words, the better we get at our collective IQ, the better we'd get at improving our collective IQ."

TECHNOLOGICAL STRESS


Rapid technological transformation is occurring in both work and social life. The products of information technology, such as mobile telephones, computers, and electronic networks, have been looked upon as the key to solving several of the most pressing problems of the Western world. At the same time, numerous studies have shown that the great majority of computerization projects fail to meet their deadlines with the originally specified functionality, mainly because human factors are not sufficiently taken into account during the planning and implementation phases of the project.

In a study of the bodily, mental, and psychophysiological reactions of employees involved in the design of advanced telecommunications systems, and of office employees using regular video display technology, several stress-related psychosomatic disorders were identified, including sleep disturbances, psychophysiological stress, and somatic complaints. Controlled intervention programs aimed at enhancing organizational structures and individual coping strategies have proved effective in counteracting the negative effects of working with information technology.

The two-way interaction between the external information technology environment and bodily and mental reactions needs to be taken more fully into account in the design and use of modern information technology. There appears to be growing awareness of human aspects when the risks and benefits of the rapid spread of information technologies are discussed.

THE HISTORY OF COMPUTERS

The Z1, created by Germany's Konrad Zuse in his parents' living room between 1936 and 1938, is considered the first electro-mechanical binary programmable (modern) computer, and arguably the first functional computer.

This series covers many of the major milestones in computer history (but not all of them) with a concentration on the history of personal home computers.

Year | Inventors/Inventions | Description of Event
1936 | Konrad Zuse - Z1 Computer | First freely programmable computer.
1942 | John Atanasoff & Clifford Berry - ABC Computer | Who was first in the computing biz is not always as easy as ABC.
1944 | Howard Aiken & Grace Hopper - Harvard Mark I Computer | The Harvard Mark I computer.
1946 | John Presper Eckert & John W. Mauchly - ENIAC 1 Computer | 20,000 vacuum tubes later...
1948 | Frederic Williams & Tom Kilburn - Manchester Baby Computer & the Williams Tube | Baby and the Williams Tube turn on the memories.
1947/48 | John Bardeen, Walter Brattain & William Shockley - The Transistor | No, a transistor is not a computer, but this invention greatly affected the history of computers.
1951 | John Presper Eckert & John W. Mauchly - UNIVAC Computer | First commercial computer, able to pick presidential winners.
1953 | International Business Machines - IBM 701 EDPM Computer | IBM enters into 'The History of Computers'.
1954 | John Backus & IBM - FORTRAN Computer Programming Language | The first successful high-level programming language.
1955 (in use 1959) | Stanford Research Institute, Bank of America & General Electric - ERMA and MICR | The first bank industry computer; also MICR (magnetic ink character recognition) for reading checks.
1958 | Jack Kilby & Robert Noyce - The Integrated Circuit | Otherwise known as 'the chip'.
1962 | Steve Russell & MIT - Spacewar Computer Game | The first computer game invented.
1964 | Douglas Engelbart - Computer Mouse & Windows | Nicknamed the mouse because the tail came out the end.
1969 | ARPAnet | The original Internet.
1970 | Intel 1103 Computer Memory | The world's first available dynamic RAM chip.
1971 | Faggin, Hoff & Mazor - Intel 4004 Computer Microprocessor | The first microprocessor.
1971 | Alan Shugart & IBM - The "Floppy" Disk | Nicknamed the "floppy" for its flexibility.
1973 | Robert Metcalfe & Xerox - Ethernet Computer Networking | Networking.
1974/75 | Scelbi, Mark-8 Altair & IBM 5100 Computers | The first consumer computers.
1976/77 | Apple I & II, TRS-80 & Commodore PET Computers | More first consumer computers.
1978 | Dan Bricklin & Bob Frankston - VisiCalc Spreadsheet Software | Any product that pays for itself in two weeks is a surefire winner.
1979 | Seymour Rubenstein & Rob Barnaby - WordStar Software | Word processors.
1981 | IBM - The IBM PC Home Computer | From an "Acorn" grows a personal computer revolution.
1981 | Microsoft - MS-DOS Computer Operating System | From "Quick and Dirty" comes the operating system of the century.
1983 | Apple Lisa Computer | The first home computer with a GUI (graphical user interface).
1984 | Apple Macintosh Computer | The more affordable home computer with a GUI.
1985 | Microsoft Windows | Microsoft begins the friendly war with Apple.
SERIES TO BE CONTINUED

BILL GATES


William (Bill) H. Gates is founder, technology advisor and board member of Microsoft Corporation, the worldwide leader in software, services and solutions that help people and businesses realize their full potential.  He served as chairman of the board until Feb. 4, 2014.

On June 27, 2008, Gates transitioned out of a day-to-day role in the company to spend more time on his global health and education work at the Bill & Melinda Gates Foundation. He shares his thoughts about the foundation and other topics on Gates Notes, a website launched in January 2010. Gates continues to serve on Microsoft’s Board of Directors and as an advisor on key development projects.

Born on Oct. 28, 1955, Gates grew up in Seattle with his two sisters. Their father, William H. Gates II, is a Seattle attorney. Their late mother, Mary Gates, was a schoolteacher, University of Washington regent, and chairwoman of United Way International.

Gates attended public elementary school and the private Lakeside School. There, he discovered his interest in software and began programming computers at age 13.

In 1973, Gates entered Harvard University as a freshman, where he lived down the hall from Steve Ballmer. While at Harvard, Gates developed a version of the programming language BASIC for the first microcomputer - the MITS Altair.

In his junior year, Gates left Harvard to devote his energies to Microsoft, a company he had begun in 1975 with his childhood friend Paul Allen. Guided by a belief that the computer would be a valuable tool on every office desktop and in every home, they began developing software for personal computers. Gates' foresight and his vision for personal computing have been central to the success of Microsoft and the software industry.

Under Gates' leadership, Microsoft's mission has been to continually advance and improve software technology, and to make it easier, more cost-effective and more enjoyable for people to use computers. The company is committed to a long-term view, reflected in its industry-leading investment in research and development each year.

In 1999, Gates wrote "Business @ the Speed of Thought," a book that shows how computer technology can solve business problems in fundamentally new ways. The book was published in 25 languages and is available in more than 60 countries. "Business @ the Speed of Thought" has received wide critical acclaim, and was listed on the best-seller lists of the "New York Times," "USA Today," "The Wall Street Journal" and on Amazon.com. Gates' previous book, "The Road Ahead," published in 1995, was at the top of the "New York Times" bestseller list for seven weeks.

Gates has donated the proceeds of both books to non-profit organizations that support the use of technology in education and skills development.


In addition to his love of computers and software, Gates founded Corbis, which is developing one of the world's largest resources of visual information - a comprehensive digital archive of art and photography from public and private collections around the globe. He is also a member of the board of directors of Berkshire Hathaway Inc., which invests in companies engaged in diverse business activities.

DOES TECHNOLOGY IMPACT CULTURE?

          In today’s technology-driven world, people expect to have the means to communicate with others at any given moment. The ability to create relationships based solely on mutual understandings and shared common interests has fed the social media phenomenon. In the past, people were able to get together physically to discuss concerns or share thoughts. However, public spheres are shifting from gatherings in coffee shops to meetings online through forums and other social media platforms. As discussed in Mediated Society: A Critical Sociology of Media, from the perspective of critical sociology the focus is on how media practices shape what we see as normal and affect society’s values. In today’s world, easy access to technology means that, when you look around, people are often using smartphones or computers to check on what’s happening in the world around them, providing a feeling of connectedness.
          According to Digital Nation, a 90-minute PBS documentary that aired on Feb. 10, 2010, the purpose of the program was “to examine the risks and possibilities, myths and realities presented by the new digital culture we all inhabit.” One of the many insights from this documentary is that in this wired world, people living in the same house or workplace can all be looking at different screens and communicating with different people. This changes how people interact with each other, as well as where our public spheres may be found (online instead of discussions at the dining room table or in meetings at work, perhaps?). Most concerning to me is the suggestion that multitasking online is cause for concern rather than applause because of its impact on cognitive abilities.


          Constant communication through technology is changing how people think of themselves and how they communicate. They can get attention, always be heard, and never have to be alone. Yet connecting electronically can also lead to isolation: with the constant sensory stimulus of texts, tweets, Facebook updates, emails and more, people often don’t allow themselves time to think or to listen to each other. From the perspective of critical sociology, media practices shape what is seen as normal and thereby affect society’s values. The ease of connecting through technology and communicating online does have an impact on culture, locally and globally, as more and more people choose to communicate online instead of in person.