Sunday, April 4, 2010

Apple's 'Revolutionary' iPad Goes on Sale in U.S.


Apple is calling its new tablet computer, the iPad, a "revolution." At electronics stores across the United States, customers lined up before sunrise on April 3, 2010, to get their hands on one. But is the iPad really going to be the future of computing technology?

The sun had barely risen, but the line outside of the Apple store in Bethesda, Maryland already stretched around the block. Apple fans and enthusiasts were eager to finally purchase an iPad, a tablet computer with a touch-screen that Apple claims will "bridge the gap" between a cell phone and a laptop computer.

Ryan Brown with his new iPad outside the Apple store in Bethesda, Maryland on April 3, 2010.

Ryan Brown was first in line at the Maryland store. He said he believes that the iPad is the "next generation" of computing technology. "I think anything hands-on is going to get [you] really, really into the technology. If you can just manipulate anything with your hands and be a part of it, it makes it a lot more personal," he said.

Matthew Carney said the iPad is a sign of things to come. "I think a lot of the advancements that will come from it, will definitely play a larger role than we probably think now. I don't think it will change everything overnight, don't get me wrong. Will we have more touch-control or touch-interface or that kind of thing? Yes," he said.

The iPad has a touch-screen that is just under 25 centimeters in diagonal. It does not have a mechanical keyboard and weighs less than one kilogram. It allows users to surf the Web, watch video, listen to music, play games or read electronic versions of books. Computer programmers have already created thousands of applications for the iPad that will allow customers to perform many other tasks previously done on regular laptops or on mobile phones.

Still, not everyone is sold on the new device. "Now, is the iPad itself a really great and elegant and beautiful device? Yes! It turns out that it is," said industry analyst James McQuivey.

"But that's not enough to start the revolution that they claim to be starting, and that's, I think, my principal disappointment here. It doesn't give you enough content. It doesn't give you most of the things you want to do online -- for example, Hulu.com, Netflix -- because it doesn't support Flash. These are things that Apple has chosen to do that I think hamper its ability to have the impact that it really could have otherwise," he added.

Regardless of whether or not the iPad is the next generation of computing technology, Ryan Brown is just glad to finally have one of his own. "I'm really excited to have it. I've been waiting forever for it," he said.

The iPad's worldwide release is expected later this month.

'Whole Earth' goes digital



TechMan was rooting around in the basement the other day and came across a hippie-era stash of -- no, not that -- Whole Earth Catalogs and magazines.

Anyone coming of age in the '60s, '70s or '80s remembers those publications, favorites of the back-to-the-earth, counterculture movement.

But what you may not remember is the role those publications and their founder played in the early days of the personal computer.

Stewart Brand, who began the catalogs (they were published in various forms from 1968 to 1988), had impressive counterculture credentials.

In 1966, he campaigned to have NASA release the rumored satellite photo of the entire Earth as seen from space. He thought it would be a uniting image. In 1967, a NASA satellite captured such a photo, and it appeared on the cover of the first catalog.

Mr. Brand took part in early scientific studies of then-legal LSD and was one of Ken Kesey's iconic Merry Pranksters. Tom Wolfe describes him in the first chapter of "The Electric Kool-Aid Acid Test."

He also had an early interest in computers and assisted Douglas Engelbart with his famous presentation to the 1968 Fall Joint Computer Conference in San Francisco that debuted many revolutionary computer technologies, including the mouse.

Mr. Brand went on to team up with Larry Brilliant to found one of the seminal dial-up bulletin boards and computer-based communities, the Whole Earth 'Lectronic Link or The Well.

The Well became a gathering place for computer types, including Mitch Kapor, designer of Lotus 1-2-3, and Grateful Dead lyricist John Perry Barlow. Mr. Kapor and Mr. Barlow joined with John Gilmore to establish the Electronic Frontier Foundation, still a leading defender of digital individual rights.

Craig Newmark started his original Craigslist on The Well. And it was a major online meeting place for Deadheads.

The initial 1968 Whole Earth Catalog, among entries for beekeeping equipment, included an article about a personal computer. At $5,000, it was the most expensive thing in the book.

In 1984, Mr. Brand published the Whole Earth Software Catalog 1.0 and a magazine called the Whole Earth Software Review, which lasted only three issues. But the software catalog was updated to 2.0 in 1985.

In 1989, the Electronic Whole Earth Catalog was published on CD-ROM using an early version of hypertext linking language.

Mr. Brand and his cohorts at Whole Earth convened the first "Hacker's Conference," which, after a sputtering start, has become an annual event.

Wrote John Markoff of the New York Times: "There was a very big Luddite component to counterculture back then, but Brand wanted nothing to do with it. ... He really believed you could take the tools of the establishment and use them for grass-roots purposes. He was, after all, the guy who coined the term personal computer."

In 1995, Mr. Brand wrote an article in Time magazine titled "We Owe It All to the Hippies." In it he said, "Counterculture's scorn for centralized authority provided the philosophical foundations of not only the leaderless Internet but also the entire personal-computer revolution."

So are Whole Earth Catalogs now consigned to the basements of ex-hippies, and the ideas in them only tie-dyed memories?

No. Stewart Brand and his catalogs live on, although much of what once seemed radical is now mainstream.

Contacted by e-mail through the Long Now Foundation, Mr. Brand said he had not changed his ideas on the use of computers for grass-roots purposes or on the continuing influence of the counterculture on computers and the Internet.

After the Whole Earth Catalog and its descendants ceased publication, New Whole Earth LLC, headed by entrepreneur and philanthropist Samuel B. Davis, acquired the intellectual property and physical assets.

As it states on the wholeearth.com website, "Although the catalog's heyday was during a specific and turbulent period of American history, the ideas found in it and in its related publications continue to engage the brightest minds of the 21st century -- and Whole Earth LLC believes that those ideas should be preserved as they were originally disseminated."

To that end, at wholeearth.com, all of the catalogs and other publications are free for viewing online. Or you can buy a PDF copy for $5 or less.

PC Pioneer, Ed Roberts

In the early history of the personal computer revolution, Dr. Edward Roberts is certainly a pivotal figure. In fact, he is widely considered the inventor of the PC.

Unfortunately, he died this week of pneumonia. He was 68.

Back in the mid-1970s, Roberts launched the MITS Altair computer. While it had limited capability, it was still a breakthrough.

Interestingly enough, it caught the attention of Bill Gates and Paul Allen (when they read an issue of Popular Electronics, which had a feature on the Altair). Perhaps there was an opportunity to develop software for this new machine?

So Gates and Allen moved to Albuquerque to be closer to MITS and started Microsoft (MSFT). Their first product was a version of BASIC, a language for programming computers, adapted for the Altair.

Roberts did well: he sold MITS in 1977 and even agreed to stay out of the computer business for five years. He then went on to become a physician, which had been his lifelong dream.

In appreciation of Roberts, Gates visited him at his bedside last week and also sent out a joint press release with Allen. In it, they said: "Ed was willing to take a chance on us -- two young guys interested in computers long before they were commonplace -- and we have always been grateful to him. The day our first untested software worked on his Altair was the start of a lot of great things. We will always have many fond memories of working with Ed."

Tom Taulli advises on business tax preparation and is also the author of a variety of books, including The Complete M&A Handbook. His website is at Taulli.com.

The Computer Revolution




The first computer was conceptualized by a 19th-century British mathematician named Charles Babbage -- creator of the speedometer -- whose Analytical Engine, a programmable logic center with memory, was never built even though Babbage worked on it for 40 years.

During World War II, the British built Colossus I, a computer designed to break Nazi military codes, while at Harvard, IBM's Mark I computer was assembled. In 1946, ENIAC (Electronic Numerical Integrator and Computer) was put to work at the University of Pennsylvania. Designed to perform artillery firing calculations, ENIAC was America's first digital electronic computer. Unlike the Mark I, which used electromechanical relays, ENIAC used vacuum tubes -- 18,000 of them, in fact -- and was much faster.

The Mark I, Colossus and ENIAC used the binary system, which reduced the counting process to two digits, 0 and 1. This was compatible with the basis of computer operation: combinations of on-off (yes-no) switches called logic gates that responded to electrical charges and produced information called bits. (A cluster of eight bits was called a byte.)

In 1947, Bell Labs invented the transistor, which replaced the vacuum tube as a faster, smaller conduit of electrical current, and a few years later the Labs produced Leprechaun, an early fully transistorized computer. One of the transistor's inventors, William Shockley, left Bell Labs and started his own firm in Palo Alto, California -- the center of an area that would soon be known as Silicon Valley.

Shockley (among others) discovered that any number of interconnected transistors or circuits could be placed on a microchip of silicon. One drawback of the microchip -- that it was "hard-wired" and thus could perform only the duties for which it was designed -- was resolved by Intel Corporation's invention of the microprocessor, which allowed a single chip to be programmed to perform a variety of tasks.
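As a quick illustration of the binary counting described above (a modern sketch, not anything from these early machines), each additional on-off switch doubles the number of values a row of switches can represent, and a byte of eight bits encodes 256 distinct values:

```python
# Each bit is an on/off switch; n bits encode 2**n distinct values.
for bits in (1, 2, 4, 8):
    print(f"{bits} bit(s) -> {2 ** bits} values")

# A byte is a cluster of eight bits; e.g. decimal 77 as a bit pattern:
value = 77
print(f"{value} in binary: {value:08b}")  # 01001101 = 64 + 8 + 4 + 1
```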
These technological advances made computers smaller, faster and more affordable, paving the way for the advent of the personal computer.
The first personal computer was the Altair 8800, which briefly appeared on the scene in 1975. Two years later, the Apple II was unveiled. According to Time, it was "the machine that made the revolution," and was the offspring of Steven Jobs and Steven Wozniak. The latter was Apple's star designer, while Jobs used uncanny marketing skills to build Apple into a highly profitable concern with stock worth $1.7 billion by 1983.

The personal computer did not long remain the private domain of Apple and one or two other upstart companies. In 1981, IBM introduced its PC, having previously focused its efforts on manufacturing mainframe business computers. Incorporating the Intel microprocessor, the IBM PC set the standard for quality.

That same year, Adam Osborne, a Bangkok-born British columnist, introduced a 24-pound portable computer with detachable keyboard, 64K of memory and a diminutive five-inch screen, which retailed for $1,795. The progenitor of the laptop, the Osborne I was so successful that a host of imitators quickly followed. Even more compact than the Osborne, Clive Sinclair's 12-ounce ZX80, marketed in 1980 (as the Timex Sinclair 1000 in the U.S.), introduced many people to the possibilities of computers, in spite of its limitations, thanks to a $99 list price. One of the most popular and least expensive personal computers of the early Eighties was the $595 Commodore 64.
Concerns were expressed about the consequences of the computer revolution. One argument was that computers would widen the gap between the "haves" and the "have-nots," since it seemed that only those with good educations and lots of discretionary income could afford to plug into the "information network" represented, early in the decade, by nearly 1,500 databases such as Source, a Reader's Digest subsidiary, a legal database called Westlaw, and the American Medical Association's AMA/NET. Others worried that people would come to rely too much on computers to do what they had once done in their own heads, namely to remember and analyze, making a lot of learning unnecessary. Still others feared that the computer revolution would result in an increasingly isolated populace; when people worked at home, networking with their company via computer, they would lose the personal contact that made the workplace an essential element in the social fabric of the community. There was fear, as well, that the computerization of industry would cost workers their jobs. Futurists like Alvin Toffler, author of the 1980 bestseller The Third Wave, envisioned a not-too-distant future when an entire family would learn, work and play around an "electronic hearth" -- the computer -- at home.
Proponents countered that the computer would prove to be a tremendous boon to humanity. The salutary effect of computers in the field of medicine was considerable, providing more precise measurements and monitoring in such procedures as surgical anesthesia, blood testing and intravenous injections. Computers promised widened employment horizons for 10 million physically handicapped Americans, many of whom were unable to commute back and forth to work. And a majority of Americans believed computers were an excellent educational tool. A 1982 Yankelovich poll showed Americans were generally sanguine about the dawning Computer Age; 68 percent thought computers would improve their children's education, 80 percent expected computers to become as commonplace as televisions in the home, and 67 percent believed they would raise living standards through enhanced productivity. As a consequence, 2.8 million computers were sold in the U.S. in 1982, up from 1.4 million in 1981, which was in turn double the number sold in 1980. In 1982, 100 companies shared the $5 billion in sales, with Texas Instruments, Apple, IBM, Commodore, Timex and Atari the frontrunners in the personal computer market. And by 1982 there were more than 100,000 computers in the nation's public schools. Educators were delighted to report that students exposed to computers studied more and were more proficient in problem solving. Kids joined computer clubs and attended summer computer camps. In lieu of naming a "Man of the Year" for 1982, Time named the computer the "Machine of the Year."
By mid-decade the computer craze was at full throttle. On Wall Street, 36 computer-related companies had gone public with $798 million worth of stock. By 1984 retail sales of personal computers and software had reached $15 billion. A flurry of entrepreneurship in the so-called computer "aftermarket" produced $1.4 billion in sales of such commodities as computer training and tutoring services, specialty furniture, and publishing. In the latter field there were 4,200 books on personal computing in print (compared to only 500 in 1980), as well as 300 magazines, including Byte and Popular Computing. In addition, thousands of software developers competed fiercely for their share of sales that exceeded $2 billion annually by 1984. Another $11 billion was made leasing mainframe software to banks, airlines and the government. The biggest software manufacturer was Microsoft, with 1984 revenues of about $100 million. Systems software, like Microsoft's MS-DOS and AT&T's UNIX, instructed the various elements of a computer system to work in unison. Another popular software program was Lotus 1-2-3, a spreadsheet plus electronic filing system popular in the business market. Software quality ranged from outstanding to awful, as did prices. And software errors delayed the release of Apple's Macintosh computer by two years. Software pirating became a big business, as well. It was estimated that as many as 20 pirated copies of a program were peddled for every one legitimate purchase.
The public sector embraced the revolution. The Grace Commission -- the presidential review of waste in government -- criticized what it saw as a failure to seize the opportunities presented by the new technologies, and was followed by a five-year plan, launched in 1983, to increase government computer expenditures. The goal, according to one General Services Administration official, was to put "a million computers in the hands of managers as well as those who need them daily for such things as air traffic control, customs, passports and Social Security." The White House itself became computerized, with 150 terminals linked to a powerful central system. Cabinet members like Treasury Secretary Donald Regan carried desktop units with them everywhere. An email network linked the administration with 22 federal agencies. The Library of Congress began copying its more valuable collections onto digital-optical disks for public retrieval. The National Library of Medicine at Bethesda, Maryland stored electronic copies of five million medical books and articles in a single database. In 1984 the Internal Revenue Service began using optical scanners to process 18 million 1040EZ tax forms into a computer system. And the FBI's Organized Crime Information System, a $7 million computer, began compiling data on thousands of criminals and suspects.
By 1988, computer viruses had become a major concern. In that year alone, over 250,000 computers were infected in a nine-month period, raising grave doubts about the vulnerability of data systems. In September of that year the first computer virus criminal trial was conducted in Fort Worth, Texas. The defendant, a disgruntled ex-employee, was accused of infecting his former company's computer with a virus that deleted 168,000 sales commission records. Like their biological counterparts, such viruses were designed to reproduce perfect replicas of themselves to infect software that came into contact with the host computer. And there was no telling where a virus would strike. One virus, produced by the owners of a computer store in Lahore, Pakistan, managed to infect 10,000 IBM PC disks at George Washington University in the U.S. The SCORES virus spread from a Dallas-based computer services company to Boeing, NASA, the IRS and the House of Representatives. There was concern that one day a "killer virus" would find its way into the nation's electronic funds-transfer system, crash the stock exchange computer centers, or scramble air-traffic control systems. In response, 48 states quickly passed computer mischief laws, while companies and computer users scrambled to create antiviral programs.
Clearly, the computer revolution of the 1980s opened up a new frontier, one that Americans, true to their pioneer traditions, were eager to explore. Unanticipated perils lay in wait, but the possibilities in terms of enhancing our lives were limitless.

Supercomputers
As the end of the decade approached, supercomputers -- costing between $5 million and $25 million each -- were being used to locate new oil deposits, create spectacular Hollywood special effects, and design new military weapons, not to mention artificial limbs, jet engines, and a host of other products for private industry. These machines were able to crunch data at a speed measured in gigaFLOPS -- billions of floating-point operations per second. In size, most were no larger than a hot tub. The National Science Foundation established five supercomputer centers which by 1988 were linked to 200 universities and research labs. The Los Alamos National Laboratory utilized eleven supercomputers. In 1988, IBM financed a machine incorporating parallel-processing, the use of 64 processors in tandem, which would make it 100 times faster than supercomputers then in use. At the same time, IBM was working on the TF-1, consisting of 33,000 high-speed processing units -- a computer that would be 20,000 times faster than anything on the market.

prototype computer mouse

In 1964, the first prototype computer mouse was made for use with a graphical user interface. Douglas Engelbart's computer mouse received patent #3,541,541 on November 17, 1970, for an "X-Y Position Indicator For A Display System."


HP CALCULATORS

NOW THREE CALCULATOR POSTERS ARE FOR SALE!



The Calculator Reference is proud to now offer three fabulous calculator posters: the wildly popular Curta Calculator poster, the Calculators of HP poster, and the new Curta Calculator poster printed in German, in the words of the Great Master, Curt Herzstark. Click on any picture to learn more. -Rick-

NEW HP CALCULATORS!


How the Personal Computer Was Born


The Smithsonian Institution's "Information Age" exhibit in Washington has long had a section devoted to the early days of personal computing. One of the most prominently displayed early computers in the exhibit is a rare pre-production model of the famous Altair 8800. How that computer got from my Texas office to the Smithsonian Institution is a story worth telling.

The Altair computer sparked the personal computing revolution when this magazine appeared on newsstands nearly three decades ago. Photograph by Forrest M. Mims III. Click image to enlarge.

Twenty-five years ago Popular Electronics magazine featured on its cover a photograph of a build-it-yourself computer called the Altair 8800. An accompanying article by Ed Roberts described how to build the computer. You could even order a complete kit of parts for a little less than $400 from MITS, Inc., Roberts's company in Albuquerque, New Mexico.

Bill Gates and Paul Allen were so excited by the Altair that Gates left Harvard and both moved to Albuquerque. There they formed a tiny company called Microsoft and worked alongside Roberts at MITS to develop software for the Altair. It was 1975, and no one then could possibly have imagined that Bill Gates, a 19-year-old college dropout who never slept and always needed a haircut, would one day become the richest man on Earth.

At a time when business computers sold for tens of thousands of dollars, the Altair was a major breakthrough. Roberts and Gates understood that better than anyone else. Yet by today's standards, the Altair was unbelievably primitive. Instead of a keyboard, information was entered via a clumsy row of toggle switches. The Altair's "monitor screen" was a row of flashing red lights. You had to understand the binary number system to know what they meant.
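To picture what reading that row of red lights involved (a hypothetical modern sketch, not anything MITS shipped), here is how one byte of output would map onto eight lamps, lit for a 1 bit and dark for a 0 bit:

```python
def panel_lights(byte_value: int) -> str:
    """Render one byte as an Altair-style row of lights: '*' lit, '.' dark."""
    assert 0 <= byte_value <= 255
    # Format as eight binary digits, then map each digit to a lamp state.
    return "".join("*" if bit == "1" else "." for bit in f"{byte_value:08b}")

# The operator had to translate patterns like these back into numbers by hand.
print(panel_lights(0))    # ........
print(panel_lights(77))   # .*..**.*
print(panel_lights(255))  # ********
```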

That meant I had to learn binary, for Roberts asked me to write the Altair operator's manual. Since MITS was near bankruptcy, my fee would be one of the first Altairs. That Altair is the one that has been displayed at the Smithsonian for the past dozen or so years. In next week's column, I'll tell that story. But first, I want to tell you more about Ed Roberts and MITS, the company he and I began.

Roberts and I became friends in 1968 when we were assigned to the Air Force's state-of-the-art laser laboratory at Kirtland Air Force Base in Albuquerque, New Mexico. Roberts was a second lieutenant who had entered the service as an enlisted man and was commissioned when he received his college degree. I was a first lieutenant just back from a year as an intelligence officer in Vietnam.

While working at the laser lab, Roberts and I had long discussions about our common interests in science and electronics. Roberts used to say that his goals were to earn a million dollars before he reached 30, learn to fly, become a medical doctor and move to a farm in Georgia. Roberts also said that he wanted to design an inexpensive digital computer. None of us then realized that the invention of the computer-on-a-chip known as the microprocessor would eventually allow all of Ed Roberts's dreams to come true.

In 1969 I began writing articles on model rocketry and hobby electronics for various magazines. Being an entrepreneur at heart, Roberts noticed this and wondered if we could form a company to sell kits based on the projects I wrote about. We soon held a meeting in Ed's kitchen with two other guys from the lab, Stan Cagle and Bob Zaller. Within a few weeks we had formed Micro Instrumentation and Telemetry Systems Inc. or simply MITS.

My job was to write magazine articles about how to build and use our products. My wife Minnie and Roberts's wife Joan helped package the kits in blue plastic boxes. Our first product was a tiny light flasher I had designed for tracking and recovering test flights of guided model rockets at night.

The flasher circuit was based on a basic code-practice oscillator from Radio Shack that I had modified to send pulses to an infrared-emitting diode used in a miniature travel aid for the blind that I designed in 1966.

We sold a few hundred light flasher kits, but it soon became obvious that MITS needed a bigger market. So I wrote an article about how to build a device that would transmit your voice over a beam of invisible light. Popular Electronics magazine made the project one of their 1970 cover stories, and we sold a hundred or so kits. With this money we began work on a laser system for hobbyists, which was also published in Popular Electronics. Most of our customers were hobbyists, but we also received orders from universities, corporations and even the FBI.

The kit business eventually reached a break-even point. But MITS needed a product with much more appeal. Next time I'll tell you about that product and how it led directly to the Altair 8800.

Forgotten critics said the Altair was simple and plain. But Bill Gates looked inside and saw an electronic brain.