Talk:Altair 8800
This article is rated B-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects.
A fact from this article was featured on Wikipedia's Main Page in the On this day section on December 19, 2004, December 19, 2005, December 19, 2006, and December 19, 2007.
CPU clock speed?
Anyone know what speed the processor ran at? There's absolutely nothing about it anywhere on the page. A bit strange when that's normally a pretty fundamental part of any computer specification, especially the earlier models. The IMSAI 8080 page suggests that machine ran at 2MHz (or in some cases up to 3MHz with a 8085 installed), but there's no particular reason why the Altair would necessarily have used the same clock. Maybe it was a bit variable because of the machine's somewhat homebrew nature, but there must surely have been some quasi-standard operating speed imposed simply by whatever crystal they were able to most cheaply get hold of in bulk quantities (e.g. the NTSC colourburst as used by a lot of 8-bits and even some 16-bits for 1.02, 1.79, 2.38, 3.58, 4.77, 5.34, 7.16... MHz speeds) and include as a default in the kits...? In which case it should be a reasonably well known quantity? 51.7.49.27 (talk) 17:04, 18 September 2018 (UTC)
The Altair 8800 shipped from MITS with a 2.000 MHz crystal, per the MITS parts list. Google "MITS Altair parts list"; it's readily available on the web. — Preceding unsigned comment added by Dive614c (talk • contribs) 00:55, 12 March 2019 (UTC)
Weasel Words?
I've spotted some phrases such as "Programming the Altair was an extremely tedious process" and "No particular level of thought (or rushed design) went into the design", which appear to be weasel words. Any confirmation on this? 220.236.18.233 08:02, 26 November 2006 (UTC)
- Pretty much the whole article is in the wrong tone. :-( As the tag says, it's more like reading a mag than an encyclopedia. The weasel words are just part of it. — Northgrove 08:17, 15 August 2007 (UTC)
- The tone may be off, but I expect the description is still pretty true. Consider how long it would take you to enter anything meaningful into a computer when it required first encoding each byte into binary (either mentally, or with reference to a code sheet), flipping an average of four fairly robust toggle-switches, and then hitting a separate "load" switch... for every single character. Those two sentences I just typed would probably have taken about a half hour.
- As for the bus, it was a fairly clear, self-admitted hack-up job. Take whatever the cheapest sufficiently-large connectors you can find are, take your processor, other necessary circuit boards, and power supplies, and do the bare minimum amount of work (as it would also be quite a long-winded and tricky job just to get any kind of working system drawn up using those semi-random parts) to route the various signals coming off the CPU and the support hardware back and forth to where they need to go through the connector, preferably with the shortest possible traces / least amount of PCB area taken as your guiding principle on what goes where and everything else being a secondary concern. I mean, you're building the first practical expandable micro here, as a hobbyist effort, somewhat on the hoof, without any real useful guides on good design practice to work from. There's going to be some mistakes.
- However, that bit about the shorting out of adjacent power rails seems a bit odd - unless there was a manufacturing problem with the slots and/or cards, that shouldn't really be a problem other than when inserting or removing the cards themselves (as often they might start/end up entering/exiting the slot at a slight angle that could bridge adjacent contacts together)... and that's not something you should do with the system under power regardless of the architecture, because shorting *any* adjacent traces together, other than maybe two grounds, could be a recipe for disaster. And whilst 16 (or is it 18?) volts is relatively high for computer internals, it's still far less than what we'd generally consider necessary to cause arcing between conductors spaced that far apart. Methinks this may have been a problem one particular Altair hacker ran into whilst hotswapping cards in a gung-ho fashion and not really an issue for everyday users... 51.7.49.27 (talk) 16:58, 18 September 2018 (UTC)
Toy or machine?
Right now the article just describes 8800's history. It doesn't say anything about the function. Right now, I don't see any useful aspects in this machine. It can only make LEDs blink. May it be used for something more useful? --134.91.77.152 11:51, 19 December 2005 (UTC)
- You need to understand that the Altair 8800 pre-dates the software industry. When it hit the market, it was literally a piece of hardware that had no software until you wrote it yourself. This is why it was sold as a kit that you had to build yourself. i.e. It was marketed to hardcore techie hobbyists who liked tinkering with gizmos and gadgets that didn't necessarily perform any practical function. e.g. Ham radios. The Altair had tons of potential... it just had to wait for people who were motivated enough to tap it. One of Microsoft's earliest endeavors was to write apps for the Altair. Druff (talk) 00:52, 19 February 2014 (UTC)
- I don't think the LEDs were just used for blinkenlights; they were actually used for data output. Personal computers haven't always had monitors. That said, I think there were add-on cards for Altairs which allowed them to be hooked up to CRT terminals.
- Whoops, when I removed the signature from the post above I forgot to add in the edit summary that it was in preparation to vanish. Consider this one the same.
Yeah, I've been scratching my head over this for a long time. I know that the lights must have meant something, binary values of each decimal number or something? What could the machine do; what was its killer application besides BASIC? 86.143.234.154 07:44, 2 September 2006 (UTC)
- I worked at the Computer Mart of Orange. In 1976, we sold IMSAIs, not Altairs. The hobby industry made these early micros useful quickly. Even back then we would recommend to our customers that they buy a minimum of 8K of static RAM, a 3P+S serial card, a Tarbell Cassette Interface, and a terminal. Bootstraps still had to be entered by hand through the front panel. RastaKins (talk) 15:06, 4 May 2024 (UTC)
In the January 1975 Popular Electronics article, a sidebar is dedicated to listing possible applications. (visible at http://www.computermuseum.20m.com/images/popelec/Page%2038.jpg) JimH443 23:12, 12 May 2007 (UTC)
A stock Altair 8800 (with no additional IO cards) appears to be a very limited beast. Though the article's description of the programming task is essentially correct (flipping switches to input the binary opcodes), the description of the output isn't quite spot-on. The LEDs of the Altair are connected directly to the address bus and data bus. So, rather than writing a program to "make the lights flash", when the CPU was executing instructions, the lights would flash wildly, reflecting whatever signals were present on the bus. If the program wasn't branching, you could discern the binary incrementing of the address lines, but that's about it.
When the user was programming the Altair, the CPU was halted. The RAM in the Altair was static RAM and didn't need to be refreshed. Essentially, during programming, nothing was happening inside the machine. After flipping the switches to specify the next instruction, the user depressed the (spring-loaded) "Deposit" switch. This action manually clocked the bus one cycle, thereby latching that byte into the specified RAM location.
As I understand it, a common early Altair programming technique was to write a program that performed some calculation and then wrote the result into a certain RAM location (often looping and doing this operation many times over... why not?). The user would then halt the program (and indeed the CPU), and then use the switches and LEDs on the front panel to examine the contents of the target RAM location.
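The deposit/examine workflow described above can be sketched in a few lines of Python. This is a purely hypothetical simulation for illustration; the class and method names are invented, and the auto-advance-on-deposit ("Deposit Next") behaviour is an assumption, not a transcription of MITS documentation:

```python
# Minimal sketch of the front-panel workflow described above: with the
# CPU halted, each DEPOSIT latches the data-switch byte into static RAM
# at the current address, and EXAMINE reads a location back onto the
# data LEDs. All names here are illustrative, not from MITS docs.

class FrontPanel:
    def __init__(self, ram_size=256):
        self.ram = [0] * ram_size   # static RAM: holds state with the CPU halted
        self.address = 0            # address currently set on the switches

    def examine(self, address):
        """Set the address switches and raise EXAMINE: return what the LEDs show."""
        self.address = address
        return self.ram[self.address]

    def deposit(self, byte):
        """Latch the data-switch byte into the current location, then advance
        so the next byte can be toggled in immediately (assumed behaviour)."""
        self.ram[self.address] = byte & 0xFF
        self.address += 1

panel = FrontPanel()
panel.examine(0)                                # start entering a program at address 0
for byte in (0x3E, 0x23, 0xD3, 0xFF, 0x76):    # one toggle-and-deposit per byte
    panel.deposit(byte)
```

Five deposits for a five-byte program: every single byte required setting eight switches and pressing Deposit, which is why the "extremely tedious" description rings true.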
Altair basic
The Altair BASIC section is kind of confusing. --Gbleem 20:24, 19 December 2005 (UTC)
Price?
I don't see the machine's price mentioned anywhere here... --Golbez 16:02, 14 July 2006 (UTC)
According to http://www.computermuseum.20m.com/popelectronics.htm, "... can be built for under $400" (this page links to an image of the article taken directly from the magazine) JimH443 23:06, 12 May 2007 (UTC)
Here is a link to the March 1975 price list.[1] The Altair 8800 kit version is $439, the assembled version is $621. A working system would cost around $2000 -- SWTPC6800 23:20, 12 May 2007 (UTC)
I bought mine in 1975 for the "Cover of Pop tronics" price of $395. At least that's what I recall! WardXmodem (talk) 17:21, 2 September 2012 (UTC)
I've never commented before on any wikipedia page or contributed directly to any page, but I have noticed that on this page (linked to from Kaypro -> CP/M -> Altair 8800) as well as others that there are often listed references to "today's dollars." I don't know exactly how to go about correcting those references regarding products like consumer electronics. The fact is that a "cutting-edge" computer system of 1974 cost approximately $2K, and in 2014 a comparable system... $2K!! The entire reason that the tech boom has happened over the last forty or so years is precisely because it "beats" inflation over time. If the consumer electronics industry suffered inflationary economic pricing akin to other consumer industries, we'd still be playing "Combat" on our 2600s! An example I give often... New Atari 2600 about $200, new NES about $200, new Playstation about $250, new XBox about $300... etc. If the price of the Atari is adjusted for inflation, it rolls in around $1500!!! Nobody would still be playing video games or surfing the internet if economic inflation had anything to do with consumer electronics and personal computers! I guess the ultimate solution would be to simply remove those references to "today's dollars," as they provide no real sense of reference to the real world value of the product. — Preceding unsigned comment added by 65.27.233.64 (talk) 12:53, 1 October 2014 (UTC)
- Stop being silly. The price of a computer may not inflate, but that doesn't mean the entire economy no longer suffers inflation just because of the invention of the microcomputer. The key thing is, a (say) 2018 dollar is "worth" a lot less than a 1973 one. Over those intervening 45 years, regular inflation has devalued the currency. You can buy a lot less stuff for ten bucks now than you could almost half a century ago. And, on the whole, wages have sort of kept pace with that; people are now paid a lot more per year in raw dollar terms, but their actual spending power is about the same. So, we have the example on the page at the moment of the Intel 8080's original sales price of $360 being equivalent to $1700 in the modern age (that is, whenever that edit was made; it may already be slightly out of date), which means overall inflation has been somewhere around 450% in-between. In other words, to buy goods of the same general value as an early to mid 1970s dollar, you now have to spend $4.50.
- Therefore, although the dollar price of computer hardware may have stayed fairly constant over nearly half a century (with, instead, its complexity and power increasing exponentially according to Moore's Law), its actual value, and notional "cost", has deflated considerably. Dropping $500 on a computer isn't seen as too massive an expense these days; in 1973, that was a comparatively huge amount of money. It was as large a chunk of your general expenditure then as spending about $2250 would be today. That's how we make the otherwise fairly meaningless sales prices of 45 years ago more relevant and understandable to a modern audience, by reporting not just their raw nominal figure, but also relating an idea of what that spend would actually have meant vs your household budget, by applying an inflation adjustment to it.
- I would have hoped this didn't really need spelling out, you know? People earned a lot less money, in pure dollar terms, in the 70s vs what they do today. $439 then would be roughly equivalent to $2k today. Your Altair kit wasn't a "cheap" commodity, it was more like the value of a high end MacBook Pro or a beefy gaming PC, and that was just for the basic version that arrived in pieces, with a quarter kb of RAM and no keyboard, screen, or offline storage.
- If you wanted a useful amount of memory, a human-facing interface that operated in terms of accepting typed "English" input and displaying similar output, some kind of backing store, and an operating/programming system that could actually do anything other than display memory locations and stream data in/out of them (e.g. BASIC or CP/M), then you were talking easily $1000-plus, or up into the realm of maybe $5000 today. That's not far off the price of a cheap brand-new car, easily the price of an okay motorcycle or used car, could just about be used as the mortgage deposit on a house, etc, or would represent a very high end desktop workstation or more likely a rackmount server-class machine. It's a pretty large chunk of anyone's annual wage, and was significant more in that it did at least make it affordable for someone who really wanted it and could save up (similar to the schemes Ford ran to enable their workers to buy Model Ts), whereas any previous system was completely into the realms of fantasy (e.g. the Data General Nova that heavily inspired the Altair had a base price of $3995, which would have been an entire year's wage for a lot of people). But if we just say "it cost $439 basic to about $1200 for a properly usable model", that doesn't convey the same "which was actually really expensive for an electronics hobbyist and still a considerable investment for most businesses" gravitas that noting "(about $2200 to $5000 in late-2010s money)" does. 51.7.49.27 (talk) 16:23, 18 September 2018 (UTC)
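The "today's dollars" conversion the comments above are arguing about is nothing more than multiplying a nominal price by a price-index ratio. A small sketch using the rough factor of 4.5 that the thread itself derives (an illustration of the thread's own arithmetic, not an official CPI figure):

```python
# CPI-style adjustment: price_then * (index_now / index_then).
# The factor ~4.5 is the one the discussion above derives from the
# 8080's $360 list price being quoted as ~$1700 in modern money;
# it is illustrative, not a Bureau of Labor Statistics value.

def adjust(price_then, factor=4.5):
    """Convert a mid-1970s price to late-2010s dollars at the given factor."""
    return round(price_then * factor)

print(adjust(360))     # Intel 8080 list price
print(adjust(439))     # Altair 8800 kit price
print(adjust(1200))    # a "properly usable" system, per the comment above
```

Which is how $439 becomes "roughly $2k today" and a $1000-plus usable system becomes "the realm of maybe $5000".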
Solomon
Who is Solomon? —The preceding unsigned comment was added by 196.25.255.246 (talk) 16:34, 12 December 2006 (UTC).
It seems that Solomon is Les Solomon, the editor of 'Popular Electronics'; the machine's name was suggested by his young daughter, as mentioned in the Old-Computer site at http://www.old-computers.com/museum/computer.asp?st=1&c=62. --Jtravis06 07:18, 16 December 2006 (UTC)
The First World Altair Computer Convention was held in Albuquerque, New Mexico, on March 26-28, 1976.
Report on a seminar held on Saturday evening.
MITS Computer Notes, April 1976, Page 7
Computer Power of the Future
Annette Milford
Les Solomon, editor of Popular Electronics, told the Saturday night crowd of 700 that five years ago he and Ed Roberts, MITS' president, were speculating about whether it might be possible to sell 200 Altairs and break even.
Les Solomon entertained a curious audience with anecdotes about how it all began for MITS. The name for MITS' computer, for example, was inspired by his 12-year-old daughter. "She said why don't you call it Altair--that's where the Enterprise is going tonight."
SWTPC6800 02:41, 26 December 2006 (UTC)
Spoiler
The spoiler template appears to be of little use: the warning is placed before it is clear which movie is spoiled. But then again, maybe the fact that the Altair 8800 appears in Malcolm in the Middle is the whole spoiler? ElMorador 11:30, 19 December 2006 (UTC)
Sources
I don't see any. —Preceding unsigned comment added by 64.81.227.133 (talk) 21:22, 14 October 2007 (UTC)
Memory size?
How much memory was there in that thing? The article doesn't say. --207.176.159.90 (talk) 03:15, 19 December 2007 (UTC)
- The original Altair 8800 came with 256 bytes of RAM. To get a useful system you would need to purchase additional memory boards. The early boards would hold 4K bytes, and soon there were 8K and 16K boards. The Intel 8080 CPU could address 64K bytes. A system that used audio cassettes for data storage was usable with 8K of memory. A floppy disk system would need at least 12K or 16K. In December 1976 a 4K RAM board cost around $160. -- SWTPC6800 (talk) 05:21, 19 December 2007 (UTC)
- As it says the supplied board was a "1024-word" one, does that mean it was mostly unpopulated and could be upgraded to 1kb if you bought the appropriate chips separately? Which would at least have made the machine somewhat more useful, especially in combination with some cheap terminal (possibly a modified electric typewriter), than a mere 256 bytes would (which is only really suitable for toggle-switch and lamp-readout use). I can't imagine MITS would have considered a word to consist of just 2 bits... 51.7.49.27 (talk) 16:25, 18 September 2018 (UTC)
Publication date
The introduction date for the Altair computer has been reported as December 19, 1974. I have never found a source for that date. The June 1975 issue has this:
- Ogdin, Jerry (June 1975). "Computer Bits". Popular Electronics. 7 (6). New York: Ziff-Davis. p. 69: "The breakthrough in low-cost microprocessors occurred just before Christmas 1974, when the January issue of Popular Electronics reached readers … "
I made a visit to the Copyright Office in the Library of Congress to find the correct date. The reading room is LM-404 in the James Madison building; 101 Independence Avenue, S.E., Washington, D.C. The pre-1978 records are handwritten on 3-by-5 cards and stored in card catalog drawers, just like an old library. The published date is when it went to the printer, not when it was mailed to the reader.
The January 1975 issue of Popular Electronics was published on November 29, 1974; registration number B99920. The January 1975 issue of Radio-Electronics was published on December 19, 1974; registration number B992753. -- SWTPC6800 (talk) 01:09, 22 May 2009 (UTC)
- It was registered to the U.S. Copyright Office on that date, but the issue was not published until a week before Christmas '74, per Mims himself. Secondary sources by and large say that December 19, 1974 was the exact date it hit the newsstands. DigitalIceAge (talk) 20:44, 25 December 2024 (UTC)
Continuing the article with some details?
Is it appropriate based upon other TALK that details be added to the article?
Let me just type off the top of my head a possible addition to the article, though it would read more like a BLOG than an encyclopedic entry - but I think Wikipedia is -- or I think Wikipedia should be -- more "friendly" -- as long as it is 100% factual.
The Altair's appeal was not in what it could do, but in what it was - it was -- at $395 -- an affordable computer. Some people purchased it with no purpose in mind except to "own a computer", and "play with it".
With your $395 purchase for the kit, you got the chassis with front panel, power supply, CPU board, and I believe (need to check sources) 2K static RAM memory board, but populated only with 256 bytes. need confirmation - I recall 2K, previous said 256 bytes the chip was a 2102, which was 1024 x 1, so ... I can't remember that initial 256 bytes, i.e. where it was etc. Sorry.
- It's possible the expansion boards used 2102s or similar, with a bank of 32 of them on the 4k and a full 64 (maybe 4 rows of 16) on the 8k board, but the base-level processor board (especially in the kit), in order to be as cheap as possible, may have used Intel's slightly older 1101 (the company's first ever product, in fact) which probably saw something of a price drop on introduction of the higher density parts. That was only a 256 x 1, so one 8-chip bank of those = 256 bytes. With space on the board for another 24 to bring it up to 1kb. If we assume the 8k boards were absolutely rammed solid with ICs, and they would have been about the same size or slightly larger than the 1101s, that leaves half of a similar-sized processor board free for the actual CPU and all the support chips it required. As memory densities increased and prices of low density parts decreased, it's possible that the basic systems might have gone over to including two banks of 2102s instead (1k x 1, x16, = 2kbyte), or even just a discrete memory card packed out with 1101s; if a similar 2102-using board offered 8k, then that cheaper pack-in option would have provided just 2k instead. And of course, there were a lot of partial failures in the early semiconductor memory fabbing processes - an awwwwful lot of early computers had somewhat odd memory sizes and unexpectedly high chip counts from using half-faulty chips relabeled with concealing code numbers, wired up so that "bad" half was permanently excluded from being accessed... a "2k" board could be a 4k one loaded with half-bad 2102s given a slightly different code indicating that they actually only had 512 x 1 available.
- Still, as demonstrated by the microcontroller ecosystem, and even things like the Atari 2600, there's a lot you can do with that (or with 1k, or even 256 bytes), if you load in machine code and then use the computer (and its general IO ports/functions) for e.g. process control or the like, or install a ROM board that holds the code you want to run and only use the RAM for live data and working space. And a lot of those uses don't require a proper keyboard or VDU either; you can just get by with the front panel switches and lights to control what the program does once it's running and display status, maybe with overlays that replaced the original labels (I believe there were bitwise instructions both for sensing how the switches were set, and for changing what the lights showed), or expand to a fairly simple numeric keypad plus a 7-segment, nixie, light-board or one-or-two-line alphanumeric readout or a printer. Think of all the clearly microcontroller-driven things in your home that lack an alphanumeric keyboard and high resolution character or bitmap display yet do just fine - at the time the Altair came along, there simply weren't any existing options for that kind of control, so a relatively compact, shoebox-sized all-in-one computer would have seemed like a perfectly good option instead, particularly if its functionality could be shared amongst several devices so you didn't need to add a shoebox-worth of volume to every single machine but could wire them in to a central control hub. After all, the i8048 which offered remarkably similar specifications ended up being the workhorse of a whole load of single-board controller solutions, from dishwashers, microwaves and VCRs through to car engine management and all kinds of industrial machinery in the late 70s/early 80s. 51.7.49.27 (talk) 16:43, 18 September 2018 (UTC)
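The board-capacity arithmetic in the comment above (8 chips of 256 x 1 = 256 bytes, 32 chips of 1024 x 1 = 4K, and so on) can be checked directly. The chip part numbers and board layouts are the thread's own guesses restated here, not verified MITS specifications:

```python
# Bytes on a board of identical RAM chips: chips * words * bits / 8.
# Chip choices below (Intel 1101 = 256 x 1, Intel 2102 = 1024 x 1)
# follow the speculation in the comment above; they are not a
# confirmed MITS bill of materials.

def board_bytes(chips, words_per_chip, bits_per_word=1):
    """Total bytes on a board populated with identical RAM chips."""
    return chips * words_per_chip * bits_per_word // 8

print(board_bytes(8, 256))      # one 8-chip bank of 1101s -> 256 bytes
print(board_bytes(32, 256))     # fully populated 1101 board -> 1K
print(board_bytes(32, 1024))    # 32 x 2102 -> 4K board
print(board_bytes(64, 1024))    # 64 x 2102 -> 8K board
```

One bank of eight 1-bit-wide chips per byte of width is why these boards always grew in multiples of eight chips.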
Your only input with the base configuration above was the front panel switches, and the output was the LEDs which were hooked to an 8-bit output port.
Nevertheless, you could do a lot with nothing more:
- Even without an Altair you could learn the 8080 instruction set
- ...but WITH ONE you WANTED to learn it
- Once you learned the front panel:
- how to address memory
- how to store bytes
- how to read back bytes
- you could enter small programs, address the first byte of them, and press "run".
If for example, you wrote a program (pseudocode):
- load a value of 23 hex into the accumulator
- output the accumulator to the port which addresses the front panel
- halt
When it ran, and the lights showed
_ _ # _ _ _ # # <= for they were laid out in an octal grouping
then you had the thrill of having learned enough of a new computer language, how to program and run it, and get the right answer.
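For the curious, the pseudocode above maps onto three 8080 instructions: MVI A,23h (3E 23), OUT port (D3 nn), HLT (76). A tiny Python sketch can simulate just enough of the 8080 to reproduce the lamp pattern; the choice of output port is illustrative, and the interpreter handles only these three opcodes:

```python
# Just enough of an 8080 to run the three-instruction example above:
# MVI A,23h (3E 23), OUT 0FFh (D3 FF), HLT (76). The port number is
# illustrative. Lamps are rendered '#' = on, '_' = off, MSB first.

program = [0x3E, 0x23, 0xD3, 0xFF, 0x76]

def run(mem):
    """Execute until HLT; return the byte last latched to the output port."""
    a, pc, lights = 0, 0, None
    while True:
        op = mem[pc]
        if op == 0x3E:          # MVI A, immediate: load next byte into accumulator
            a = mem[pc + 1]; pc += 2
        elif op == 0xD3:        # OUT port: latch accumulator onto the lamps
            lights = a; pc += 2
        elif op == 0x76:        # HLT: stop and leave the lamps showing
            return lights

def lamp_string(value):
    """Render a byte the way the front-panel lamps would show it."""
    return ''.join('#' if value & (1 << bit) else '_' for bit in range(7, -1, -1))

print(lamp_string(run(program)))    # 0x23 = binary 00100011
```

The printed pattern matches the one above: two lamps dark, one lit, three dark, two lit, i.e. octal 0-4-3 read in the 2-3-3 grouping of the panel.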
It would be significant to know (1) how many never finished the kit; (2) how many finished it but never played with it; (3) how many were satisfied with it in its base configuration with just switches and LEDs, and (4) some examples of the height to which it rose (such as being the basis for my inventing/programming the Xmodem protocol, and, with Randy Suess ably handling the hardware, inventing and programming the first BBS).
I could go on with a discussion of the options available for it in the early years, including the pitfalls (very finicky 4K dynamic RAM board), the lack of ROM so you had to "switch in" your program -or- a bootloader for some other device (from hand pulled paper tape in an optical reader, to full blown operating systems).
Then there was the huge explosion in boards for the Altair bus, later renamed S-100 when "clone" computer manufacturers supported it but didn't want to advertise the competition.
Among the most significant (TO ME, so not appropriately "encyclopedic"):
- Processor Technology Video Display Module (VDM)
- Memory-mapped 16 x 64 character display
- (Processor Technology?) 3P+S
- Three bidirectional parallel ports, and a serial port, creating the foundation for most who used "serial terminals" as their "console" particularly for the CP/M operating system.
- additional memory boards (after the reliability problems in the Altair 4K dynamic memory board)
- various EPROM boards for their ability to hold boot loaders for peripherals.
I could go on for hours. but won't. Not without some idea as to whether my history in buying, building, and enhancing my Altair, is of interest to anyone in the context of "Wikipedia" where it was commented articles should be encyclopedic. This implies Wikipedia is NOT a place for say a blog, or a dry product review, etc.
Let me know your thoughts. WardXmodem (talk) 17:17, 2 September 2012 (UTC)
User interface and usability
What was the popular way of interacting with the Altair 8800? Considering that the machine could be instructed by BASIC, I find it hard to believe that switches and lights were the common way of interaction. --Abdull (talk) 17:32, 1 January 2013 (UTC)
- A fairly common accessory would have been a serial port board connected to either a Teletype or some kind of terminal. --Wtshymanski (talk) 04:21, 4 January 2013 (UTC)
Further sections needed
This article could do with further details on:
- total sales for 8800, 8800a and 8800b.
- When was last one made.
- Notable museums with Altair exhibits.
- Current interest / collectable status.
John a s (talk) 14:58, 7 December 2013 (UTC)
External links modified
Hello fellow Wikipedians,
I have just added archive links to 4 external links on Altair 8800. Please take a moment to review my edit. You may add {{cbignore}}
after the link to keep me from modifying it, if I keep adding bad data, but formatting bugs should be reported instead. Alternatively, you can add {{nobots|deny=InternetArchiveBot}}
to keep me off the page altogether, but should be used as a last resort. I made the following changes:
- Attempted to fix sourcing for http://startup.nmnaturalhistory.org/gallery/notesViewer.php?ii=76_4&p=7
- Attempted to fix sourcing for http://startup.nmnaturalhistory.org/gallery/notesViewer.php?ii=75_8&p=2
- Attempted to fix sourcing for http://startup.nmnaturalhistory.org/gallery/notesViewer.php?ii=75_10&p=3
- Attempted to fix sourcing for http://startup.nmnaturalhistory.org/gallery/item.php?ii=26
When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).
An editor has reviewed this edit and fixed any errors that were found.
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—cyberbot IITalk to my owner:Online 19:35, 29 March 2016 (UTC)