I bought an Apple II and then a SoftCard. I was trying to learn 'C', and there was a compiler on CP/M (Borland) but not on the Apple II.
It is always hard to go back and understand what it was like before an event. Like the Velvet Revolution. But at the time I was working on an IBM 360, mostly doing Fortran for scientists running anemometer simulations. The center for this activity was the person in charge of the 360 who could dole out time on the computer.
The power dynamic was something I did not really notice, but in retrospect this was frustrating for the mathematicians/scientists trying to run simulations. They had to queue up and wait.
Then one day a mathematician brought in an Apple II running VisiCalc. His own personal computer. He ran his simulations on that.
It was like our small world trembled as the tectonic plates of technology shifted. The power shifted in just that one instant. It was cool how our view of the world changed all at once.
Steve Wozniak was incredibly foresighted when designing the Apple II, to make sure that expansion cards could disable the default ROMs and even disable the CPU, making this kind of thing possible. The article mentions a chunk of memory "used by peripheral devices"; every expansion card got its own slice of the address space, so you could plug a card in any slot and it would Just Work (maybe you'd have to tell software what slot the card was in). I was very disappointed when I "upgraded" to a 386 and suddenly cards had to be manually configured to non-conflicting IRQs and I/O addresses.
I don't think this is entirely due to Wozniak. Early "home" computer systems were based on connecting cards to a bus (e.g. the S-100 bus), with one card carrying the CPU, another the RAM, a third the disk controller, the video card, and so on. The cards were then memory mapped; presumably you controlled the memory mapping by setting jumpers. (I guess you're saying that the Apple II managed this automatically?) Of course the full story might be a bit more complicated: the 6502 and 6800 used memory-mapped I/O, whereas the 8080 (and Z80?) had dedicated I/O pins coming out of the CPU.
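To illustrate that last distinction, here is a rough C sketch, not Apple II or CP/M code; the register address, the port number, and the `in_port` helper are all made up for illustration. With memory-mapped I/O a device register is just an address, so an ordinary load reaches the hardware; with port-mapped I/O the CPU has separate IN/OUT instructions and a distinct port address space, which C can only reach through assembly or an intrinsic.

```c
#include <stdint.h>

/* Memory-mapped I/O (6502/6800 style): the device register is just an
 * address, so a plain volatile load talks to the hardware.
 * FAKE_STATUS_REG_ADDR is a made-up address for illustration only. */
#define FAKE_STATUS_REG_ADDR 0xC0F0u

uint8_t mmio_read_status(void) {
    volatile uint8_t *reg = (volatile uint8_t *)(uintptr_t)FAKE_STATUS_REG_ADDR;
    return *reg; /* compiles to an ordinary memory read of that address */
}

/* Port-mapped I/O (8080/Z80 style): a separate port address space, reached
 * only via IN/OUT instructions. in_port() is a hypothetical stand-in for a
 * wrapper that would contain a line of inline assembly on real hardware. */
static uint8_t in_port(uint8_t port) {
    (void)port;
    return 0; /* placeholder; real hardware needs an IN instruction here */
}

uint8_t portio_read_status(void) {
    return in_port(0xF0); /* a port number, not a memory address */
}
```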
Memory mapping happened automatically. Each card was mapped based on the slot it was in: $C100-$C7FF, I believe, with each slot n assigned 256 bytes at $Cn00.
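To make that slot-relative mapping concrete, here is a minimal C sketch assuming the layout described above (slot n's 256-byte ROM window at $Cn00, for slots 1 through 7). The constant names and helper are illustrative only, not anything from Apple firmware.

```c
#include <stdint.h>
#include <stdio.h>

/* Apple II-style slot addressing: slot n (1-7) gets a 256-byte ROM window
 * at $Cn00, so software can locate a card's firmware knowing only its slot. */
#define SLOT_ROM_BASE 0xC000u
#define SLOT_ROM_SIZE 0x0100u

static uint16_t slot_rom_base(unsigned slot) {
    /* slot 1 -> $C100, slot 2 -> $C200, ..., slot 7 -> $C700 */
    return (uint16_t)(SLOT_ROM_BASE + slot * SLOT_ROM_SIZE);
}

int main(void) {
    for (unsigned slot = 1; slot <= 7; slot++) {
        printf("slot %u: ROM window $%04X-$%04X\n",
               slot,
               slot_rom_base(slot),
               (unsigned)(slot_rom_base(slot) + SLOT_ROM_SIZE - 1));
    }
    return 0;
}
```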
Clearly Steve Wozniak was a unique [technical and geeky] guy at that time. Thinking about interoperability back then was prophetic.
Cool post from Raymond as usual!
I’d like to add that the hardware for the SoftCard was designed by Tim Paterson at SCP around the same time he was writing what would become MS-DOS.
Years later, the Apple DOS Compatibility Card (code-named Houdini) could do the same thing. It had a 486DX2/66 and a Sound Blaster on board. By default it shared the host Mac’s memory, and you could run both sides simultaneously, but it wasn’t a great experience on either one; they both ran slower.
Alternatively, you could put up to a 32MB RAM SIMM directly on the card.
Now that I think about it, my first Mac did the same thing with the Apple //e card.
I remember my dad using the Z80 SoftCard to run WordStar, which was astonishingly powerful considering how long ago it reigned as the king of word processors. I’d be surprised if some of its control keys hadn’t influenced our editors, although as a Vim user I can’t immediately think of any.
Turbo Pascal and other Borland products used to use keys based on WordStar. These days JOE (Joe's Own Editor) still uses a similar keyset.
WordStar was basically all we needed, and it still is.
Imagine if you had something that small and powerful today.
https://archive.org/details/wordstar_202310
> WordStar was basically all we needed, and it still is.
>
> Imagine if you had something that small and powerful today.
I completely agree with the first part. But why do you think we don't have that today, if we choose to do so?
1. there is no longer a market for certain sorts of software, whether due to market dominance (Word), or the likely market size being too small to bother with.
2. FOSS has dropped into Code Reuse Mode*, & getting out of that is going to require motivated individuals to build their own, entirely new versions. LibreOffice is Good Enough for most users, so why go to all the effort of starting from ground level when a fork & reskin will do?
one would hope that FOSS would lead to having cool, alternate approaches to particular use problems (as in the old days, when there were myriad word processors on the market — XyWrite, WordPerfect, WriteNow, Word, etc., etc.), but Good Enough means that attention can be put on more interesting problems. what we're left with is a mediocre mass of applications.
*which is why nearly every alternate OS ends up feeling like Linux with missing programs & weird commands, so why not just use Linux? we're going to be stuck in a rut for a long time to come.
> FOSS has dropped into Code Reuse Mode, & getting out of that is going to require motivated individuals to build their own, entirely new versions
I don't necessarily disagree that there are some issues in the ecosystem, but I don't think that's the problem. For starters, I don't think anyone is* forking LibreOffice and throwing on a layer of paint? And when I need a word processor, I personally prefer AbiWord, which is its own thing.
In particular,
> which is why nearly every alternate OS ends up feeling like Linux with missing programs & weird commands, so why not just use Linux? we're going to be stuck in a rut for a long time to come.
This feels backwards. Alternatives tend to present a similar interface without* sharing code. In fact, even just on Linux I'd argue we have rather a lot of (re)implementations of the same things: consider that we are in a position where Debian is shipping GNU coreutils, Ubuntu replaced them with a Rust version (uutils), and Alpine has been happily shipping busybox for years (AFAIK, as long as it's existed).
> But why do you think we don't have that today, if we choose to do so?
Network effects.
This is great, I’m building new machines on the 6502 and can use this. Thanks.
I wonder if anyone ever used the Z80 SoftCard or one of its many clones to run something other than CP/M?
I got MP/M working on the softcard back in the day. Never really had an application for it though.
One of the biggest disappointments of the 8-bit era was the Commodore 128 not being able to use both the 8502 and Z80 CPUs in some kind of coprocessor setup.
"According to Wikipedia..." aargh Wikipedia is not the source!
Maybe in 2005, but in 2025, Wikipedia is more reliably accurate than many more-official-sounding sources.
I mean, Wikipedia is referenced and well sourced, so it is a perfectly valid source in this day and age. I read papers weekly, and nowadays, with the pressure to publish often, they contain more lies and dishonesty than Wikipedia does.
Would be cool if Microsoft would focus on engineering instead of blog posts
The Old New Thing is very much engineering. Any contemporary engineers who don't think they have anything to learn from the experience of the past as recounted in the blog are doomed to repeat the same missteps.
And as much as one might worry that Raymond Chen's blogging is holding up important Microsoft initiatives, I very much doubt that it's much of a distraction for a megacorporation.
RE "....Any contemporary engineers who don't think they have anything to learn from the experience of the past....." 100% correct
Personally, I prefer cool blog posts over "add another Copilot button that does nothing to something that did not require it anyway" or "paper over a perfectly fine API with a newer version that has 60% of the functionality and 120% of the bugs" (which is what Microsoft engineering mostly seems to boil down to these days), but you be you...
They don't have an engineering problem, they have a management problem which ruins and obstructs anything good their engineers might try to make.
Raymond Chen’s blog posts are one of the best things coming out of Microsoft.
As a Unix person for decades, I find it great to see his incredibly experienced and insightful view on software development in general, and on OS development at Microsoft in particular, and to read about his experience with all these nice processor architectures no longer supported by NT.