VOICE ONE:
This is SCIENCE IN THE NEWS in VOA Special English. I'm Bob Doughty.
VOICE TWO:
And I'm Faith Lapidus. Today we talk about the past, present and future of
personal computers.
(MUSIC)
VOICE ONE:
You could say the first personal computers were the simple counting devices
of ancient times. But maybe we are going back too far.
Some people would say the first real counting machines were the inventions
of Blaise Pascal in the middle of the 1600s. Pascal was a French
scientist and inventor who designed a machine with wheels and gears. These worked
together to add or take away numbers.
A few years later, a German scientist improved on Pascal's work and created a
system that permitted multiplication and division.
VOICE TWO:
In the middle of the 1800s, an Englishman named George Boole developed a math system
based on zero and one. Boolean logic was important to the development of the
computers of today.
But we still have some more historical ground to cover.
In eighteen ninety, census workers counted the United States population with
help from a system designed by Herman Hollerith. That system, designed two years
earlier, used two machines. One machine put holes into paper to mark
information. The other machine quickly read the holes and produced a final
count.
VOICE ONE:
Herman Hollerith went on to establish a company
called the Tabulating Machine Company. In 1911 he sold the company -- and 13 years later it
became International Business Machines. The company had already been operating
under the IBM name in Canada.
So now we jump from 1924 to 1981 --
August 12th, 1981, to be exact. That was the day the company
announced a new product called the IBM Personal Computer. It was not the first
personal computer ever invented, but its success helped build a new market.
So now we are up to the age of the modern P.C. But we have left out some
steps along the way.
VOICE TWO:
In 1930, an analog computer called the differential analyzer used gears and shafts to solve
differential equations. Complex mathematics became easier.
Later, the Mark One computer, built by IBM for Harvard University, performed
operations using a system of electromechanical switches.
Then in 1946 came the Electronic Numerical Integrator and
Computer -- ENIAC. It used a system of vacuum tubes.
ENIAC was huge. It took up almost 170 square meters in a
building at the University of Pennsylvania. ENIAC was unlike anything before it.
Its digital processing was lightning fast, at least compared to older computers.
Analog computers used moving parts. Digital devices process information
electronically in the form of numbers. The difference was like night and day.
(MUSIC)
VOICE ONE:
Until the 1970s, computers were far too big and costly for the
average person. Most were mainframe computers, used by government agencies,
research centers and big companies.
But people found ways to shrink computers and to increase their power and
speed. Transistors replaced vacuum tubes. Later, integrated circuits combined
many transistors on a single small chip.
The Apple Computer Company in California started selling personal computers
in the late seventies. But the IBM Personal Computer is credited with producing
widespread interest in home computers.
VOICE TWO:
An IBM official called it "the computer for just about anyone
who has ever wanted a personal system at the office, on the university campus
or at home." Many Americans found the price reasonable: about 1,500
dollars.
The success of the IBM Personal Computer helped not only IBM. It also helped
two young companies develop into the industry leaders they are today.
IBM bought the processor, the brain, for its personal computer from
Intel. Intel was then a thirteen-year-old company. And IBM brought in Microsoft to
provide the programming. Microsoft was then a small, little-known company. Bill
Gates and Paul Allen started Microsoft in 1975.
The IBM Personal Computer came with the first version of Microsoft DOS, or
disk operating system. Today Microsoft operating systems are found on most of
the personal computers in the world.
(MUSIC)
VOICE ONE:
The development of laptop computers meant that people could take them
anywhere. And computers kept getting smaller.
In the nineteen nineties people started talking about PalmPilots and
BlackBerries and other P.D.A.'s -- personal digital assistants. A person could
hold a small computer and, in some cases, a phone and an Internet connection all
in one hand.
Microprocessors and improved wireless communications led to the age of cell
phones, then cell phones with cameras and more.
Today people can hold in their hand more computing power and speed than the
room-sized mainframes of the past.
(MUSIC)
VOICE TWO:
The IBM Personal Computer started to face growing competition before
long. One competitor was the Apple Macintosh, launched in 1984. What people
really liked was its ease of use. DOS users had to enter written commands. But
Mac users could simply click on icons, little pictures on the screen. Apple had
borrowed the idea from designers at Xerox. Then Microsoft borrowed the idea for
its Windows operating system.
Today Apple still has a loyal following, but those users represent a small
share of the market for personal computers. IBM does not even make personal
computers anymore. It sold that part of its business last year to the Chinese
company Lenovo.
(MUSIC)
VOICE ONE:
So where will the future take the personal computer? Before we
discuss that, we should talk a little about the Internet. It began in the 1960s as
a Defense Department project. It was designed to link researchers around the
United States with a secure way to communicate even in the event of a nuclear
war.
The designers linked together a network of networks, with no point of central
control over the system. That way, messages could get through even if one or
more links were lost. It was built sort of like a spider's web.
The Internet came into popular use in the 1990s. People learned
that "www" meant World Wide Web. Tim Berners-Lee invented this system at the
CERN physics laboratory in Switzerland. Now the public had a new way to send
e-mail, find information and buy goods. Today many people could not live without
it.
As a result, many see the future of the Internet as the future of the
personal computer.
VOICE TWO:
Ray Ozzie is one of those people. He designed Lotus Notes, the widely used
e-mail and group collaboration system now owned by IBM. He was recently named chief
software architect at Microsoft.
VOICE ONE:
Since joining Microsoft last year, Ray Ozzie has tried to make its software
work better with the Internet. Across the industry, programs are increasingly
being designed for use on the Web, instead of being housed on personal
computers. Microsoft is offering more Web-based services.
Last month, at a meeting of financial analysts, Ray Ozzie discussed the
changing times. He described what he called "the P.C. era" as a thing of the past. He
said this is a new period in which the Internet is at the center.
VOICE TWO:
But where is the Internet itself going? There is a lot of talk about
improvements that people say will represent the next version of the Internet.
Internet 2.0, they call it. There are hopes, but at the same time there are
reports of security weaknesses that will need to be fixed.
Also, in recent months there has been a lot of debate about the issue of "Net
neutrality." Internet neutrality basically means that Internet service producers
should not speed up or slow down or block any traffic on the Web. In other
words, all Web content providers should be treated the same.
Telecommunications companies say they spend a lot of money to build systems
that carry Internet traffic. They say they should be able to charge more to
those who use these systems more heavily or who are willing to pay for special
services.
(MUSIC)
VOICE ONE:
SCIENCE IN THE NEWS was written by Caty Weaver and produced
by Mario Ritter. Internet users can find transcripts and download archives of
our shows at voaspecialenglish.com.