Sunday, April 4, 2010

Apple's 'Revolutionary' iPad Goes on Sale in America

Apple is calling its new tablet computer, the iPad, a "revolution." At electronics stores across the United States, customers lined up before sunrise on April 3, 2010 to get their hands on one. But is the iPad really the future of computing technology?

The sun had barely risen, but the line outside of the Apple store in Bethesda, Maryland already stretched around the block. Apple fans and enthusiasts were eager to finally purchase an iPad, a tablet computer with a touch-screen that Apple claims will "bridge the gap" between a cell phone and a laptop computer.

Ryan Brown with his new iPad outside the Apple store in Bethesda, Maryland on April 3, 2010.

Ryan Brown was first in line at the Maryland store. He said he believes that the iPad is the "next generation" of computing technology. "I think anything hands-on is going to get [you] really, really into the technology. If you can just manipulate anything with your hands and be a part of it, it makes it a lot more personal," he said.

Matthew Carney said the iPad is a sign of things to come. "I think a lot of the advancements that will come from it, will definitely play a larger role than we probably think now. I don't think it will change everything overnight, don't get me wrong. Will we have more touch-control or touch-interface or that kind of thing? Yes," he said.

The iPad has a touch-screen measuring just under 25 centimeters diagonally. It has no mechanical keyboard and weighs less than one kilogram. It allows users to surf the Web, watch video, listen to music, play games or read electronic versions of books. Computer programmers have already created thousands of applications for the iPad that allow customers to perform many other tasks previously done on regular laptops or on mobile phones.

Still, not everyone is sold on the new device. "Now, is the iPad itself a really great and elegant and beautiful device? Yes! It turns out that it is," said industry analyst James McQuivey.

"But that's not enough to start the revolution that they claim to be starting, and that's, I think, my principal disappointment here. It doesn't give you enough content. It doesn't give you most of the things you want to do online, for example -- Hulu.com, Netflix -- because it doesn't support Flash. These are things that Apple has chosen to do that I think hamper its ability to have the impact that it really could have otherwise," he added.

Regardless of whether or not the iPad is the next generation of computing technology, Ryan Brown is just glad to finally have one of his own. "I'm really excited to have it. I've been waiting forever for it," he said.

The iPad's worldwide release is expected later this month.

'Whole Earth' goes digital



TechMan was rooting around in the basement the other day and came across a hippie-era stash of -- no, not that -- Whole Earth Catalogs and magazines.

Anyone coming of age in the '60s, '70s or '80s remembers those publications, favorites of the back-to-the-earth, counterculture movement.

But what you may not remember is the role the publications and their founder played in the early days of the personal computer.

Stewart Brand, who began the catalogs (they were published in various forms from 1968 to 1988), had impressive counterculture credentials.

In 1966, he campaigned to have NASA release the rumored satellite photo of the entire Earth as seen from space. He thought it would be a uniting image. In 1968, a NASA astronaut took such a photo, and it appeared on the cover of the first catalog.

Mr. Brand took part in early scientific studies of then-legal LSD and was one of Ken Kesey's iconic Merry Pranksters. Tom Wolfe describes him in the first chapter of "The Electric Kool-Aid Acid Test."

He also had an early interest in computers and assisted Douglas Engelbart with his famous presentation to the 1968 Fall Joint Computer Conference in San Francisco, which debuted many revolutionary computer technologies, including the mouse.

Mr. Brand went on to team up with Larry Brilliant to found one of the seminal dial-up bulletin boards and computer-based communities, the Whole Earth 'Lectronic Link, or The Well.

The Well became a gathering place for computer types, including Mitch Kapor, designer of Lotus 1-2-3, and Grateful Dead lyricist John Perry Barlow. Mr. Kapor and Mr. Barlow joined with John Gilmore to establish the Electronic Frontier Foundation, still a leading defender of digital individual rights.

Craig Newmark started his original Craigslist on The Well. And it was a major online meeting place for Deadheads.

The initial 1968 Whole Earth Catalog had, between the entries for beekeeping equipment, an article about a personal computer. At $5,000, it was the most expensive thing in the book.

In 1984, Mr. Brand published the Whole Earth Software Catalog 1.0 and a magazine called the Whole Earth Software Review, which lasted only three issues. But the software catalog was updated to 2.0 in 1985.

In 1989, the Electronic Whole Earth Catalog was published on CD-ROM using an early version of hypertext linking language.

Mr. Brand and his cohorts at Whole Earth convened the first "Hacker's Conference," which, after a sputtering start, has become an annual event.

Wrote John Markoff of the New York Times: "There was a very big Luddite component to counterculture back then, but Brand wanted nothing to do with it. ... He really believed you could take the tools of the establishment and use them for grass-roots purposes. He was, after all, the guy who coined the term personal computer."

In 1995, Mr. Brand wrote an article in Time magazine titled "We Owe It All to the Hippies." In it he said, "Counterculture's scorn for centralized authority provided the philosophical foundations of not only the leaderless Internet but also the entire personal-computer revolution."

So are Whole Earth Catalogs now consigned to the basements of ex-hippies, and are the ideas in them only tie-dyed memories?

No. Stewart Brand and his catalogs live on, although much of what once seemed radical is now mainstream.

Contacted by e-mail through the Long Now Foundation, Mr. Brand said he had not changed his ideas on the use of computers for grass-roots purposes or on the continuing influence of the counterculture on computers and the Internet.

After the Whole Earth Catalog and its descendants ceased publication, New Whole Earth LLC, headed by entrepreneur and philanthropist Samuel B. Davis, acquired the intellectual property and physical assets.

As it states on the wholeearth.com website, "Although the catalog's heyday was during a specific and turbulent period of American history, the ideas found in it and in its related publications continue to engage the brightest minds of the 21st century -- and Whole Earth LLC believes that those ideas should be preserved as they were originally disseminated."

To that end, at wholeearth.com, all of the catalogs and other publications are free for viewing online. Or you can buy a PDF copy for $5 or less.

PC Pioneer, Ed Roberts

In the early history of the personal computer revolution, Dr. Edward Roberts is certainly a pivotal figure. In fact, he is considered the inventor of the PC.

Unfortunately, he died this week of pneumonia. He was 68.

Back in the mid-1970s, Roberts launched the MITS Altair computer. While it had limited capability, it was still a breakthrough.

Interestingly enough, it caught the attention of Bill Gates and Paul Allen when they read the issue of Popular Electronics that featured the Altair. Perhaps there was an opportunity to develop software for this new machine?

So Gates and Allen moved to Albuquerque to be closer to MITS and started Microsoft (MSFT). Their first product was a version of BASIC, a programming language, written for the Altair.

While Roberts did well, he sold MITS in 1977 and even agreed to stay out of the computer business for five years. He then went on to become a physician, which had actually been his lifelong dream.

In appreciation of Roberts, Gates visited him last week at his deathbed and also sent out a joint press release with Allen. In it, they say: "Ed was willing to take a chance on us -- two young guys interested in computers long before they were commonplace -- and we have always been grateful to him. The day our first untested software worked on his Altair was the start of a lot of great things. We will always have many fond memories of working with Ed."

Tom Taulli advises on business tax preparation and is also the author of a variety of books, including The Complete M&A Handbook. His website is at Taulli.com.

The Computer Revolution




The first computer was conceptualized by a 19th-century British mathematician named Charles Babbage -- creator of the speedometer -- whose Analytical Engine, a programmable logic center with memory, was never built even though Babbage worked on it for 40 years. During World War II, the British built Colossus I, a computer designed to break Nazi military codes, while at Harvard, IBM's Mark I computer was assembled. In 1946, ENIAC (Electronic Numerical Integrator and Calculator) was put to work at the University of Pennsylvania. Designed to perform artillery firing calculations, ENIAC was America's first digital electronic computer. Unlike the Mark I, which used electromechanical relays, ENIAC used vacuum tubes -- 18,000 of them, in fact -- and was much faster.

The Mark I, Colossus and ENIAC used the binary system, which reduced the counting process to two digits, 0 and 1. This was compatible with the basis of computer operation: combinations of on-off (yes-no) switches called logic gates that responded to electrical charges and produced information called bits. (A cluster of eight bits was called a byte.)

In 1947, Bell Labs invented the transistor, which replaced the vacuum tube as a faster, smaller conduit of electrical current, and a few years later the Labs produced Leprechaun, the world's first fully transistorized computer. One of the transistor's inventors, William Shockley, left Bell Labs and started his own firm in Palo Alto, California -- the center of an area that would soon be known as Silicon Valley. Shockley (among others) discovered that any number of interconnected transistors or circuits could be placed on a microchip of silicon. One drawback of the microchip -- that it was "hard-wired" and thus could perform only the duties for which it was designed -- was resolved by Intel Corporation's invention of the microprocessor, which allowed a single chip to be programmed to perform a variety of tasks. These technological advances made computers smaller, faster and more affordable, paving the way for the advent of the personal computer.
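To make the logic-gate and bit arithmetic described above concrete, here is a minimal sketch in Python. The gate functions and the half-adder construction are standard textbook devices, not details of any machine described here:

    # Logic gates are on-off (yes-no) switches; their outputs are bits.
    def AND(a, b):
        return a & b

    def OR(a, b):
        return a | b

    def NOT(a):
        return 1 - a

    # Combining gates does arithmetic: a half adder sums two bits.
    def half_adder(a, b):
        sum_bit = OR(AND(a, NOT(b)), AND(NOT(a), b))  # exclusive OR built from AND/OR/NOT
        carry = AND(a, b)
        return sum_bit, carry

    print(half_adder(1, 1))  # (0, 1): one plus one is binary 10

    # A cluster of eight bits is a byte; this pattern encodes the number 65.
    bits = [0, 1, 0, 0, 0, 0, 0, 1]
    print(sum(bit << (7 - i) for i, bit in enumerate(bits)))  # 65

In real hardware the gates are transistor circuits rather than Python functions, but the arithmetic they perform is exactly this.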
The first personal computer was the Altair 8800, which briefly appeared on the scene in 1975. Two years later, the Apple II was unveiled. According to Time, it was "the machine that made the revolution," and was the offspring of Steven Jobs and Stephen Wozniak. The latter was Apple's star designer, while Jobs used uncanny marketing skills to build Apple into a highly profitable concern with stock worth $1.7 billion by 1983.

The personal computer did not long remain the private domain of Apple and one or two other upstart companies. In 1981, IBM introduced its PC, having previously focused its efforts on manufacturing mainframe business computers. Incorporating the Intel microprocessor, the IBM PC set the standard for quality. That same year, Adam Osborne, a Bangkok-born British columnist, introduced a 24-pound portable computer with detachable keyboard, 64K of memory and a diminutive five-inch screen, which retailed for $1,795. The progenitor of the laptop, the Osborne I was so successful that a host of imitators quickly followed. Even more compact than the Osborne, Clive Sinclair's 12-ounce ZX80, marketed in 1980 (as the Timex Sinclair 1000 in the U.S.), introduced many people to the possibilities of computers, in spite of its limitations, thanks to a $99 list price. One of the most popular and least expensive personal computers of the early Eighties was the $595 Commodore 64.
Concerns were expressed about the consequences of the computer revolution. One argument was that computers would widen the gap between the "haves" and the "have-nots," as it seemed that only those with good educations and lots of discretionary income could afford to plug into the "information network" as represented, early in the decade, by nearly 1,500 databases like the Source, a Reader's Digest subsidiary, a legal database called Westlaw, and the American Medical Association's AMA/NET. Others worried that people would come to rely too much on computers to do what they had once done in their own heads, namely to remember and analyze, thereby making a lot of learning unnecessary. Still others feared that the computer revolution would result in an increasingly isolated populace; when people worked at home, networking with their company via computer, they would lose the personal contact that made the workplace an essential element in the social fabric of the community. There was fear, as well, that the computerization of industry would cost workers their jobs. Futurists like Alvin Toffler, author of the 1980 bestseller The Third Wave, envisioned a not-too-distant future when an entire family would learn, work and play around an "electronic hearth" -- the computer -- at home.
Proponents countered that the computer would prove to be a tremendous boon to humanity. The salutary effect of computers in the field of medicine was considerable, providing more precise measurements and monitoring in such procedures as surgical anesthesia, blood testing and intravenous injections. Computers promised widened employment horizons for 10 million physically handicapped Americans, many of whom were unable to commute back and forth to work. And a majority of Americans believed computers were an excellent educational tool. A 1982 Yankelovich poll showed Americans were generally sanguine about the dawning Computer Age; 68 percent thought computers would improve their children's education, 80 percent expected computers to become as commonplace as televisions in the home, and 67 percent believed they would raise living standards through enhanced productivity. As a consequence, 2.8 million computers were sold in the U.S. in 1982, up from 1.4 million in 1981, which was in turn double the number sold in 1980. In 1982, 100 companies shared the $5 billion in sales, with Texas Instruments, Apple, IBM, Commodore, Timex and Atari the frontrunners in the personal computer market. And by 1982 there were more than 100,000 computers in the nation's public schools. Educators were delighted to report that students exposed to computers studied more and were more proficient in problem solving. Kids joined computer clubs and attended summer computer camps. In lieu of naming a "Man of the Year" for 1982, Time named the computer the "Machine of the Year."
By mid-decade the computer craze was at full throttle. On Wall Street, 36 computer-related companies had gone public with $798 million worth of stock. By 1984 retail sales of personal computers and software had reached $15 billion. A flurry of entrepreneurship in the so-called computer "aftermarket" produced $1.4 billion in sales of such commodities as computer training and tutoring services, specialty furniture, and publishing. In the latter field there were 4,200 books on personal computing in print (compared to only 500 in 1980), as well as 300 magazines, including Byte and Popular Computing. In addition, thousands of software developers competed fiercely for their share of sales that exceeded $2 billion annually by 1984. Another $11 billion was made leasing mainframe software to banks, airlines and the government. The biggest software manufacturer was Microsoft, with 1984 revenues of about $100 million. Systems software, like Microsoft's MS-DOS and AT&T's UNIX, instructed the various elements of a computer system to work in unison. Another popular software program was Lotus 1-2-3, a spreadsheet plus electronic filing system popular in the business market. Software quality ranged from outstanding to awful, as did prices. And software errors delayed the release of Apple's Macintosh computer by two years. Software pirating became a big business, as well. It was estimated that as many as 20 pirated copies of a program were peddled for every one legitimate purchase.
The public sector embraced the revolution. The Grace Commission -- the presidential review of waste in government -- criticized what it saw as a failure to seize the opportunities presented by the new technologies, and was followed by a five-year plan, launched in 1983, to increase government computer expenditures. The goal, according to one General Services Administration official, was to put "a million computers in the hands of managers as well as those who need them daily for such things as air traffic control, customs, passports and Social Security." The White House itself became computerized, with 150 terminals linked to a powerful central system. Cabinet members like Treasury Secretary Donald Regan carried desktop units with them everywhere. An e-mail network linked the administration with 22 federal agencies. The Library of Congress began copying its more valuable collections onto digital-optical disks for public retrieval. The National Library of Medicine in Bethesda, Maryland stored electronic copies of five million medical books and articles in a single database. In 1984 the Internal Revenue Service began using optical scanners to process 18 million 1040EZ tax forms into a computer system. And the FBI's Organized Crime Information System, a $7 million computer, began compiling data on thousands of criminals and suspects.
By 1988, computer viruses had become a major concern. In that year alone, over 250,000 computers were infected in a nine-month period, raising grave doubts about the vulnerability of data systems. In September of that year the first computer virus criminal trial was conducted in Fort Worth, Texas. The defendant, a disgruntled ex-employee, was accused of infecting his former company's computer with a virus that deleted 168,000 sales commission records. Like their biological counterparts, such viruses were designed to reproduce perfect replicas of themselves to infect software that came into contact with the host computer. And there was no telling where a virus would strike. One virus, produced by the owners of a computer store in Lahore, Pakistan, managed to infect 10,000 IBM PC disks at George Washington University in the U.S. The SCORES virus spread from a Dallas-based computer services company to Boeing, NASA, the IRS and the House of Representatives. There was concern that one day a "killer virus" would find its way into the nation's electronic funds-transfer system, crash the stock exchange computer centers, or scramble air-traffic control systems. In response, 48 states quickly passed computer mischief laws, while companies and computer users scrambled to create antiviral programs.
Clearly, the computer revolution of the 1980s opened up a new frontier, one that Americans, true to their pioneer traditions, were eager to explore. Unanticipated perils lay in wait, but the possibilities in terms of enhancing our lives were limitless.

Supercomputers
As the end of the decade approached, supercomputers -- costing between $5 million and $25 million each -- were being used to locate new oil deposits, create spectacular Hollywood special effects, and design new military weapons, not to mention artificial limbs, jet engines, and a host of other products for private industry. These machines were able to crunch data at speeds measured in gigaFLOPS -- billions of operations per second. In size, most were no larger than a hot tub. The National Science Foundation established five supercomputer centers, which by 1988 were linked to 200 universities and research labs. The Los Alamos National Laboratory utilized eleven supercomputers. In 1988, IBM financed a machine incorporating parallel processing -- the use of 64 processors in tandem -- which would make it 100 times faster than supercomputers then in use. At the same time, IBM was working on the TF-1, consisting of 33,000 high-speed processing units -- a computer that would be 20,000 times faster than anything on the market.

prototype computer mouse

In 1964, the first prototype computer mouse was made for use with a graphical user interface. Douglas Engelbart's computer mouse received patent #3,541,541 on November 17, 1970, for an "X-Y Position Indicator For A Display System."

computer mouse - original patent

HP CALCULATORS

The Calculator Reference is proud to offer three calculator posters: the wildly popular Curta Calculator poster, the Calculators of HP poster, and a new Curta Calculator poster printed in German, in the words of the Great Master, Curt Herzstark. -Rick-


How the Personal Computer Was Born

The Smithsonian Institution's "Information Age" exhibit in Washington has long had a section devoted to the early days of personal computing. One of the most prominently displayed early computers in the exhibit is a rare pre-production model of the famous Altair 8800. How that computer got from my Texas office to the Smithsonian Institution is a story worth telling.

The Altair computer sparked the personal computing revolution when this magazine appeared on newsstands nearly three decades ago. Photograph by Forrest M. Mims III.

Twenty-five years ago Popular Electronics magazine featured on its cover a photograph of a build-it-yourself computer called the Altair 8800. An accompanying article by Ed Roberts described how to build the computer. You could even order a complete kit of parts for a little less than $400 from MITS, Inc., Roberts's company in Albuquerque, New Mexico.

Bill Gates and Paul Allen were so excited by the Altair that they left Harvard and moved to Albuquerque. There they formed a tiny company called Microsoft and worked alongside Roberts at MITS to develop software for the Altair. It was 1975, and no one then could possibly have imagined that Bill Gates, a 19-year-old college dropout who never slept and who always needed a haircut, would one day become the richest man on Earth.

At a time when business computers sold for tens of thousands of dollars, the Altair was a major breakthrough. Roberts and Gates understood that better than anyone else. Yet by today's standards, the Altair was unbelievably primitive. Instead of a keyboard, information was entered through a clumsy row of toggle switches. The Altair's "monitor screen" was a row of flashing red lights. You had to understand the binary number system to know what they meant.
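For readers curious what "understanding binary" meant in practice, here is a tiny illustrative Python sketch; the particular light pattern is invented for the example, not taken from any actual Altair program:

    # On an Altair-style front panel, a lit lamp meant binary 1 and a dark lamp 0.
    lights = "10010110"  # eight lamps read left to right; pattern chosen arbitrarily

    value = int(lights, 2)  # interpret the row of lamps as one binary number
    print(value)            # 150 in decimal
    print(hex(value))       # 0x96

Operators did this conversion in their heads, lamp by lamp, every time they read the machine's output.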

That meant I had to learn binary, for Roberts asked me to write the Altair operator's manual. Since MITS was near bankruptcy, my fee would be one of the first Altairs. That Altair is the one that has been displayed at the Smithsonian for the past dozen or so years. In next week's column, I'll tell that story. But first, I want to tell you more about Ed Roberts and MITS, the company he and I began.

Roberts and I became friends in 1968 when we were assigned to the Air Force's state-of-the-art laser laboratory at Kirtland Air Force Base in Albuquerque, New Mexico. Roberts was a second lieutenant who had entered the service as an enlisted man and was commissioned when he received his college degree. I was a first lieutenant just back from a year as an intelligence officer in Vietnam.

While working at the laser lab, Roberts and I had long discussions about our common interests in science and electronics. Roberts used to say that his goals were to earn a million dollars before he reached 30, learn to fly, become a medical doctor and move to a farm in Georgia. Roberts also said that he wanted to design an inexpensive digital computer. None of us then realized that the invention of the computer-on-a-chip known as the microprocessor would eventually allow all of Ed Roberts's dreams to come true.

In 1969 I began writing articles on model rocketry and hobby electronics for various magazines. Being an entrepreneur at heart, Roberts noticed this and wondered if we could form a company to sell kits based on the projects I wrote about. We soon held a meeting in Ed's kitchen with two other guys from the lab, Stan Cagle and Bob Zaller. Within a few weeks we had formed Micro Instrumentation and Telemetry Systems Inc. or simply MITS.

My job was to write magazine articles about how to build and use our products. My wife Minnie and Roberts's wife Joan helped package the kits in blue plastic boxes. Our first product was a tiny light flasher I had designed for tracking and recovering test flights of guided model rockets at night.

The flasher circuit was based on a basic code-practice oscillator from Radio Shack that I had modified to send pulses to an infrared-emitting diode used in a miniature travel aid for the blind that I designed in 1966.

We sold a few hundred light flasher kits, but it soon became obvious that MITS needed a bigger market. So I wrote an article about how to build a device that would transmit your voice over a beam of invisible light. Popular Electronics magazine made the project one of their 1970 cover stories, and we sold a hundred or so kits. With this money we began work on a laser system for hobbyists, which was also published in Popular Electronics. Most of our customers were hobbyists, but we also received orders from universities, corporations and even the FBI.

The kit business eventually reached a break-even point. But MITS needed a product with much more appeal. Next time I'll tell you about that product and how it led directly to the Altair 8800.

Forgotten critics said the Altair was simple and plain. But Bill Gates looked inside and saw an electronic brain.

Early History of the Personal Computer

In 1969 Intel was commissioned by a Japanese calculator company to produce an integrated circuit -- a computer chip -- for its line of calculators. Ted Hoff, who was given the assignment, was troubled by the fact that if he utilized standard methods of design, the Japanese calculators would be just about as expensive as one of the new minicomputers then being marketed, and would not do nearly as much. Hoff decided he would have to take a new approach to the calculator chip. Instead of "hardwiring" the logic of the calculator into the chip, he created what is now called a microprocessor: a chip that can be programmed to perform the operations of a calculator; i.e., a computer on a slice of silicon. It was called the 4004 because that was the number of transistors it would replace. The contract gave the Japanese calculator company exclusive rights to the 4004. Hoff realized that the 4004 was a significant technical breakthrough and was concerned that Intel should not give it away to the Japanese calculator company as part of a relatively small contract. Fortunately for Intel, the Japanese company did not realize the significance of what it had obtained and traded away its exclusive rights to the 4004 for a price reduction and some modifications in the calculator specifications.
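The difference Hoff exploited can be sketched in a few lines of Python. This is only an analogy, with instruction names invented for illustration rather than real 4004 opcodes: a hardwired chip behaves like a fixed function, while a microprocessor executes whatever program it is handed:

    # "Hardwired" logic: this chip can only ever add. A different job
    # would require designing and building a different chip.
    def hardwired_adder(a, b):
        return a + b

    # Programmable logic: one chip, many jobs, chosen by the program.
    def microprocessor(program, a, b):
        acc = a
        for instruction in program:
            if instruction == "ADD_B":
                acc += b
            elif instruction == "SUB_B":
                acc -= b
            elif instruction == "DOUBLE":
                acc *= 2
        return acc

    print(hardwired_adder(6, 7))                      # 13, and that is all it does
    print(microprocessor(["ADD_B"], 6, 7))            # 13
    print(microprocessor(["SUB_B", "DOUBLE"], 6, 7))  # (6 - 7) * 2 = -2

The same silicon runs every program, which is why one inexpensive chip could replace a whole family of custom designs.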

Intel later developed another microprocessor for the Computer Terminal Corporation (CTC). This one was called the 8008. In this case CTC could purchase the product from Intel, but Intel retained the right to market the 8008 to other customers. Intel began to create support for this programmable chip. An employee of Intel, Adam Osborne, was given the assignment of writing manuals for the programming language for the 8008. Osborne later became important in the development of the personal computer by bringing about the creation of the first portable computer; there is more about this below.

Gary Kildall, a professor at the Naval Postgraduate School in Monterey, worked at Intel to develop a language and programs for their microprocessors. Kildall also played another important role in the development of the personal computer in that he wrote the first operating system for a microprocessor. It was called CP/M. Without an operating system a personal computer is a very awkward device to use.

By the early 1970s there was a vast number of people who had had some experience with mainframe computers and would have loved to have a computer of their own. In Albuquerque, New Mexico there was a man named Ed Roberts who ran a business selling kits for assembling electronic devices. The company's name was MITS, for Micro Instrumentation and Telemetry Systems. The company was not doing too well, and Ed Roberts was looking for new products to increase sales. The calculator business was becoming saturated, especially once chip manufacturers such as Texas Instruments began to market calculators themselves. After a disastrous attempt to sell kits for programmable calculators, Ed Roberts was desperate for a new product. He decided to try to do what no one else had attempted: to create a kit for assembling a home computer. He decided to base it upon a new chip Intel had developed, the 8080. Roberts negotiated a contract with Intel that gave him a low price on the 8080 chips if he bought in large volume.

About that time a magazine, Popular Electronics, edited by Les Solomon, was looking for workable designs for desktop computers. Roberts promised Solomon a working model if Solomon would promote it through Popular Electronics. Ed Roberts decided to call his computer the Altair, after the name of a planet in a Star Trek episode Les Solomon's daughter was watching. Roberts and the MITS people worked feverishly on building a prototype of the Altair to send to Popular Electronics, but when the deadline for publication arrived the model was not quite ready. Nevertheless, Popular Electronics published a picture of the empty case of the Altair on its front cover. The computer case, with its lights and switches, did look impressive. An article in the magazine revealed that kits for the Altair were available for $397 from MITS in Albuquerque, New Mexico.

To everyone's surprise, computer buffs from all over the country sent in their $397 to buy an Altair kit. In fact, MITS was flooded with money. It went from a state of near bankruptcy, owing $365,000, to a situation in which it had hundreds of thousands of dollars in the bank. MITS's bank became a bit concerned that MITS had started engaging in something lucrative but illegal.

The Altair had a very limited capability. It had no keyboard, no video display and only 256 bytes of memory. Data input had to be done by flipping toggle switches, and the only output was the flashing lights on the front of the computer. Nevertheless, there was great enthusiasm for the Altair.

Two programmers in the Boston area (students at Harvard, actually) decided to develop software for the Altair. Their names were Bill Gates and Paul Allen. They called Ed Roberts and told him they had the programs to run the programming language BASIC on the Altair. Roberts said he would buy it if he could see it running on the Altair. Gates and Allen didn't actually have the programs written, but they immediately set out to write them. It took about six weeks, and it was an amazing accomplishment that they got it to work: they developed the programs by programming a Harvard computer to emulate the limited capabilities of the Altair. When they succeeded, Paul Allen flew to Albuquerque to demonstrate the result. Given the multitude of things that could have gone wrong, it was a miracle that the program worked -- though it worked on a more sophisticated lab version of the Altair at MITS rather than the version sold to the general public. Gates and Allen's company, Microsoft, was founded in Albuquerque and only later moved to the Seattle area.
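The emulation trick Gates and Allen used -- running programs for a machine you do not own by simulating it on one you do -- can be sketched as follows. This toy three-instruction machine is purely illustrative; the real Altair used the Intel 8080, whose instruction set is far larger:

    MEMORY_SIZE = 256  # the base Altair shipped with only 256 bytes of memory

    def emulate(program):
        """Simulate a tiny invented processor: memory, an accumulator, a program counter."""
        memory = list(program) + [0] * (MEMORY_SIZE - len(program))
        acc = 0  # accumulator register
        pc = 0   # program counter
        while True:
            op = memory[pc]
            if op == 0x01:    # LOAD: put the next byte in the accumulator
                acc = memory[pc + 1]
                pc += 2
            elif op == 0x02:  # ADD: add the next byte, wrapping at 8 bits
                acc = (acc + memory[pc + 1]) % 256
                pc += 2
            elif op == 0xFF:  # HALT: stop and return the accumulator
                return acc
            else:
                raise ValueError("unknown opcode %#x at address %d" % (op, pc))

    # LOAD 2; ADD 3; HALT  ->  5
    print(emulate([0x01, 2, 0x02, 3, 0xFF]))

A program debugged against such a simulator can then be carried to the real hardware, which is essentially what Allen did on the flight to Albuquerque.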

The members of the general public who sent in their $397 faced a long, long wait before they received their Altair kits. MITS was just not prepared to handle the volume of business that came in. But MITS had shown the demand was there, and the market started to work.

Gary Kildall joined forces with a professor from U.C. Berkeley, John Torode, to produce a small computer also based upon the 8080 chip. Torode built computers under the name Digital Systems and Kildall wrote the software under the name Intergalactic Digital Research.

The Altair's most effective early competitor was created by IMSAI Manufacturing of San Leandro, California. IMSAI was established by Bill Millard, who had no particular interest in computers but knew a hot marketing opportunity when he saw one.

About this time Lee Felsenstein entered the picture. Felsenstein was an interesting individual with a quite interesting background who played a number of important roles in the development of the personal computer. He grew up in Philadelphia and became an engineering student. One summer he got a job in the Los Angeles area working as an engineer for an operation that required a security clearance. He loved being an engineer and had no plans for doing anything else. Then one day the security officer where he worked called him into his office to inform him that he would not be given the necessary security clearance. When Lee had filled out the application forms for the job, he had stated that he did not know any members of the Communist Party, which he reaffirmed under questioning by the security officer. The security officer then informed Lee that his parents were members of the Communist Party. As Steven Levy reports in his book Hackers, based upon an interview with Lee:

Lee had not been told. He had assumed that "Communist" was just a term, red-baiting, that people flung at activist liberals like his parents. His brother [Joe] had known-- his brother had been named after Stalin!--but Lee had not been told.

The security officer told him that he could not give him a security clearance at that time, but that if he kept out of political involvements he could reapply in a year or so and would probably get a security clearance then. Lee left the organization and after a while, in 1963, moved to Berkeley, where the countercultural revolution was just beginning. Lee went to work on a weekly newspaper called the Berkeley Barb as a technician and journalist. The Barb was a radical newspaper run by Max Scheer. It did not make much money, and the staff received no pay other than when Max took them home for his wife Jane to feed them. Later Max started selling advertising space in the Barb to massage parlors and started making a lot of money. But he still did not pay the staff any salary. This upset many on the staff in two ways: first, they were perplexed that their newspaper called for social revolution while selling ads to massage parlors; second, they were not getting any of that money. A group of the Barb staff, including Lee Felsenstein, left and started another newspaper, the Berkeley Tribe. The Tribe was committed to ideological anarchism. Lee managed the Tribe for a while and then entered UC Berkeley and finished his engineering degree. After graduation he joined a communal organization called Resource One and later an offshoot, Community Memory, which sought to bring computers to the people by installing remote terminals in places of business.

About the time the Altair was announced, a group of San Francisco Bay Area computer buffs organized the Homebrew Computer Club. After the club had been operating for some time, Lee Felsenstein became the facilitator for the club, an informal master of ceremonies to direct the meetings and discussions. As many as 750 people attended the meetings, which became a major locus of information exchange on computers in the Bay Area. Steve Jobs and Stephen Wozniak attended these meetings. Adam Osborne sold his book An Introduction to Microcomputers at these meetings.

Lee Felsenstein did occasional engineering design work, including a computer named the Sol after the editor of Popular Electronics, Les Solomon. The Sol would sell for about $1,000 but included far more capability than the Altair. Felsenstein and others were also creating enhancements, such as memory boards, for the Altair. Felsenstein also designed the Osborne Computer, the first portable computer. It was not portable in the sense of a laptop computer that can be used while traveling; it was portable in the sense that it could be conveniently carried from one place to another, then plugged in and used. Its size was limited to the dimensions that could fit under a jetliner seat.

What was needed for these microcomputers was a disk drive. Disk drives had long been used with mainframe computers, but they were too expensive for the low-cost home computers the industry was trying to develop. Alan Shugart, the founder of the first major disk drive manufacturer, did announce the availability of a 5-1/4 inch drive. Gary Kildall, while developing the first microcomputer operating system, CP/M (Control Program/Monitor), acquired a Shugart disk drive. The story of operating systems for computers is told elsewhere.

INTRODUCTION: THE REVOLUTION BEGINS

The scene is the beautiful but dusty little city of Albuquerque, New Mexico. A small electronics company, unnoticed by most people except for the handful of creditors who haven't received a check for several months, is quietly working on a project that will have an impact on society as revolutionary as the printing press or the steam engine or the automobile. It is the fall of 1974, and unless the project is completed soon, the little company will simply disappear and no one will hear about its dream. The company president, Ed Roberts, has mortgaged his house and stands to lose everything. He's been working day and night, drinking too much coffee and smoking too many cigarettes. His smoker's hack is so bad that he frequently has to cover the mouthpiece on his telephone when receiving calls.

The little company, which only a few months earlier consisted of 85 employees including engineers, production workers, technicians, technical writers, bookkeepers, etc., is now down to fewer than 20 people, most of whom are working for reduced salaries and looking around for something else in case this project, a kit computer known internally as "Roberts's folly," doesn't get out the door before the bank closes the door permanently.

Roberts schedules a plane trip to New York where he will meet with the editors of Popular Electronics magazine to try to convince them to put the computer on the cover of their January 1975 issue. A prototype of the computer, which as yet does not even have a name, is shipped ahead of time via air freight. It should be there when Roberts arrives.

Roberts arrives in New York and, as Murphy's Law would dictate, the computer has been lost by the air freight company. He nervously places a call to his chief engineer and longtime friend, Bill Yates, who frantically begins building a second prototype. Armed with schematic diagrams, desperation, and a lot of guts, Roberts heads for the offices of Popular Electronics where he will try to show the editors what his computer could do if they only had one.

Fortunately for Roberts and for the revolution, one of the editors, Les Solomon, takes a personal interest in the computer, even going so far as to come up with a name for it, Altair[1]. Solomon has worked with Roberts in the past on calculator projects, and he knows that Ed is a man of his word even if he is working with limited resources. He actually believes that a prototype exists and that Roberts will have another one delivered to Popular Electronics in a couple of weeks if the lost version is not recovered. Even more incredible is the fact that Solomon believes the computer will really work.

In January 1975 the revolution begins. The Altair, which sells as a kit for under $400 and has the same computing capabilities as machines costing $20,000 to $50,000, is featured on the cover of Popular Electronics. Expecting to sell 400 computers the first year (enough to pay off the bank debt), the little company is overwhelmed by more than 1,500 orders, cash in advance, by the end of February. Roberts, who knows no limits to his entrepreneurial ambitions, takes out a full-page ad in Scientific American and begins talking about the demise of IBM.

The little company starts down the harried road to becoming a bigger company, runs into a huge roadblock called cash flow, and sells out to a big computer company in California, but not before Business Week refers to it as the "IBM of the microcomputer industry." And not before it has developed a whole new market for computers.

At first, there were the computer hobbyists (sometimes referred to as computer freaks). They were experimenters who liked building computers from kits, testing them with their own oscilloscopes, carefully examining each and every part, in many cases developing better computer equipment of their own, and inventing new and marvelous ways to use this amazing tool. Then there were forward-looking people who wanted to use computers in their business and professional lives. There were pharmacists who found they could shorten the time required to fill prescriptions by letting the computer handle the bookwork and print the labels. Small retailers found they could keep better track of their inventories and more accurate, up-to-date books. Teachers discovered that they could use the computer as an instruction aid and as a method of cutting down the time needed for grading papers and keeping records. People started using computers in their homes, first as a source of entertainment, with complicated computer games, then as recipe files, and as record-keeping, check balancing, and budgeting aids. They began to hook up their computers to other devices, such as their lighting and heating systems, their stereos and alarm clocks. They began to communicate with other home computer users through telephone hookups and with large computer information libraries. And this was only the beginning.

The little company that started it all was named MITS, and I was one of its 20 employees, in charge of technical writing and advertising. I've grown with this revolution and tried to help it along wherever I could. It has been my role to attempt to explain to outsiders what the hell is going on.

At the time of this writing there were about 100,000 individuals using computers in the United States. In the short time since the idea caught on, over 1,000 computer retail stores have sprung up across the country. Personal computing shows held in most major cities have drawn over 250,000 stupefied attendees. And personal computers are being produced by 50 different companies, including Radio Shack, Heath, Montgomery Ward, APF, and Commodore.

The reason for the fantastic growth of this industry in just three years is simple: The personal computer represents increased personal power. It gives us the ability to fight back, to cope with the complications of our increasingly bureaucratic, paper-ridden society. It is an equalizer in the new world of technology.

For as little as $400 you can buy a computer today and put it to use realizing your own intellectual, aesthetic, and economic potentials. "Power to the people" is perhaps a more fitting slogan when applied to today's personal computer than it was when it was applied to the social movements of the 1960s. Think how potent a weapon the computer can be for the middle-class citizen fed up with soaring taxes, dishonest politicians, inadequate schools, government interference, utility companies, oil companies, mass congestion, and runaway inflation.

After reading this book you will, hopefully, understand computers and how to get started with them; and I hope you will see the vision. The full impact of the personal computer will become more apparent as the technology becomes more refined and as more people discover it. Still, momentous breakthroughs have been achieved, and like it or not, the age of the personal computer is upon us.

A Brief History of the Abacus

This is a brief history of the abacus; a bibliography follows. The earliest counting devices known to man were his own hands and fingers. When those were not enough, things in nature were used: shells, twigs, pebbles, stones, and so forth. It is a good idea to think about the history of arithmetic, mathematics, writing and recorded information. Man's invention of the computer resulted from his need to quantify, to count and to do mathematical calculations. Long before the computer -- in the Roman Empire, ancient Asia, and other parts of the world -- man was inventing easier and faster ways of counting and calculating.
Definition of Abacus

" The abacus is a device, usually of wood (plastic, in recent times), having a frame that holds rods with freely-sliding beads mounted on them." 2
Counting Boards and the Salamis Tablet
The use of the abacus was pre-dated by the use of counting boards. A counting board had grooves along which one could slide beads or stones. The beads or stones did not have holes in them; they simply moved along the grooves of the counting board. "The oldest surviving counting board is the Salamis tablet (originally thought to be a gaming board), used by the Babylonians circa 300 B.C., discovered in 1846 on the island of Salamis." 2 Ancient Counting Boards is a Web site that further chronicles the history and use of counting boards and the Salamis tablet (about 300 B.C.), and how the early counting tablets improved and evolved into the first Roman abacus.

Around 1000 A.D. the Aztec peoples invented a device similar to an abacus that used corn kernels threaded through wooden frames. This was known as a Nepohualtzitzin. 3 An Aztec abacus would have seven "beads" by thirteen columns. 5 This abacus has been dated to around 900 A.D. 6
In an article attributed to Mr. Du Feibao, 1 the abacus was invented in China, having already been "mentioned in a book of the Eastern Han Dynasty, namely Supplementary Notes of the Art of Figures written by Xu Yue about the year 190 A.D." It was during the Song Dynasty (960-1127) that Zhang Zeduan painted his famous long scroll, Riverside Scene at Qingming Festival, picturing an abacus lying beside an account book. 1 The abacus was known to the Chinese as the suan-pan. 3

Mr. Du Feibao states in his article that the abacus was introduced into Japan during the Ming Dynasty (1368-1644).

What is an Abacus?

The abacus was one of the earliest counting devices in Asia and parts of Europe. It is still used today for arithmetic: adding, subtracting, multiplying and dividing.


Traditionally the Chinese abacus has 2 beads in the top section, above the horizontal bar, and 5 beads in the lower section, for each "column." The upper beads could each represent one hand; the lower beads could represent the 10 fingers.
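As a rough sketch of the arithmetic involved, using the usual convention that an upper ("heaven") bead pushed toward the bar counts as five and a lower ("earth") bead as one (the function names here are invented for illustration):

    # Each abacus column encodes one decimal digit:
    # beads pushed toward the crossbar count, upper beads as 5, lower beads as 1.
    def column_digit(upper_active, lower_active):
        return 5 * upper_active + lower_active

    def abacus_value(columns):
        """columns: (upper_active, lower_active) pairs, most significant first."""
        total = 0
        for upper, lower in columns:
            total = total * 10 + column_digit(upper, lower)
        return total

    # Three columns showing the digits 7, 0 and 3:
    print(abacus_value([(1, 2), (0, 0), (0, 3)]))  # 703

Columns carry decimal place value, which is why a skilled operator can add and multiply on an abacus as fast as on a calculator.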


picture of Chinese abacus
This picture of a black-colored abacus shows an abacus with more depth to it than the one in the previous photograph. Many an abacus was home-made. In recent years an abacus could be purchased online.

A Brief History of Computers and Networks

Part I

Webster's Dictionary defines "computer" as any programmable electronic device that can store, retrieve, and process data. The basic idea of computing develops in the 1200s, when a Muslim cleric proposes solving problems with a series of written procedures.

As early as the 1640s mechanical calculators are manufactured for sale. Records exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand-powered adding machine. Although attempts to multiply mechanically were made by Gottfried Leibniz in the 1670s, the first true multiplying calculator appears in Germany shortly before the American Revolution.

In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this "machine" came 140 years before the development of the modern computer.

Ada, Countess of Lovelace

Shortly after the first mass-produced calculator (1820), Charles Babbage begins his lifelong quest for a programmable machine. Although Babbage was a poor communicator and record-keeper, his designs are sufficiently developed by 1842 that Ada Lovelace translates and annotates a paper describing his Analytical Engine, adding notes that include what is often called the first program. She is generally regarded as the first programmer. Twelve years later George Boole, while professor of mathematics at Cork University, writes An Investigation of the Laws of Thought (1854), and is generally recognized as the father of computer science.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith of MIT, the system uses electric power (non-mechanical). The Hollerith Tabulating Company is a forerunner of today's IBM.

Just prior to the introduction of Hollerith's machine, the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although the first model is hand-powered, Burroughs quickly introduces an electric model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage, the machine can handle simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and counterclaims of who invents what and when. Part of the problem lies in the international situation that makes much of the research secret. Other problems include poor record-keeping, deception and lack of definition.

In 1935, Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completion, Zuse starts on a programmable electronic device which he completes in 1938.

John Vincent Atanasoff
Courtesy Jo Campbell
The Shore Journal

John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford Berry, assists. The "ABC" is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculation. Atanasoff shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files, and the ABC is cannibalized by students.

The Enigma

Courtesy U. S. Army

The Enigma, a complex mechanical encoder, is used by the Germans, who believe it to be unbreakable. Several people involved, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. Turing proposes a "Universal Machine" capable of "computing" any algorithm in 1937. That same year George Stibitz creates his Model K (for "kitchen"), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs, and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network.

First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained this way shortens the war. To break the code, the British, led by Turing, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.



History of Computers

The development of the modern-day computer was the result of advances in technologies and man's need to quantify. Papyrus helped early man to record language and numbers. The abacus was one of the first counting machines.
Some of the earlier mechanical counting machines lacked the technology to make the design work. For instance, some had parts made of wood prior to metal manipulation and manufacturing. Imagine the wear on wooden gears. This history of computers includes the names of early pioneers of math and computing, with links to related sites for further study.