
As I noted in my other comment (1), in 1985 the Amiga OCS's bitplane graphics (storing each bit of a pixel's index in a separate memory area) were a huge boon for 2D capability, since a 6-bitplane screen needed only 6/8ths of the bandwidth of a byte-per-pixel one, but they made 3D rendering a major pain in the ass.
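To make the planar-vs-chunky cost concrete, here's a small sketch (hypothetical helper functions, not actual Amiga code): with bitplanes, writing one pixel's colour index means touching a bit in each of six separate memory areas, while a chunky framebuffer needs a single byte write.

```python
# Sketch: planar vs chunky pixel writes. The layout (MSB-first bits,
# one bit per plane) mirrors how the Amiga stores bitplanes, but the
# helpers themselves are illustrative, not real hardware code.

WIDTH, HEIGHT, DEPTH = 320, 200, 6
ROW_BYTES = WIDTH // 8

def set_pixel_planar(planes, x, y, color):
    """Write one pixel: one read-modify-write per bitplane (6 here)."""
    byte = y * ROW_BYTES + x // 8
    mask = 0x80 >> (x % 8)               # MSB-first, as on the Amiga
    for p in range(DEPTH):
        if color & (1 << p):
            planes[p][byte] |= mask      # set bit p of this pixel
        else:
            planes[p][byte] &= ~mask & 0xFF

def set_pixel_chunky(fb, x, y, color):
    """Chunky: the whole colour index lives in one byte."""
    fb[y * WIDTH + x] = color

planes = [bytearray(ROW_BYTES * HEIGHT) for _ in range(DEPTH)]
set_pixel_planar(planes, 13, 7, 0b101011)
```

The asymmetry is the whole story: six scattered read-modify-writes per pixel versus one byte store, which is exactly why texture-mapped 3D was so painful on planar hardware.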

The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

In hindsight there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they would've seen the need for 3D rendering is tantalizing.

1: https://news.ycombinator.com/item?id=47717334



> The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

The intention was good, but the Akiko chip was functionally almost useless; it was soon surpassed by CPU chunky-to-planar algorithms. I don't think it was ever even used in any serious way by any released games (though it might have been used to help with FMV).


Ah, I was under the impression that it had a native chunky mode, but it was really a built-in C2P converter? Anyhow, it seems it was useful (1) when running on stock CD32s, but not in conjunction with faster machines.

1: https://forum.amiga.org/index.php?topic=51616.msg544232#msg5...
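For anyone unfamiliar with what a C2P routine actually does, here's a naive sketch (my own illustration; real Amiga routines used clever 68020/030 bit-merging tricks and the blitter, not a double loop like this): it takes 8 chunky pixels and redistributes their bits into one byte per bitplane.

```python
# Naive chunky-to-planar (C2P) conversion, for illustration only.
# Real CPU C2P implementations merge bits with shifts/ANDs in registers;
# this loop just shows the data movement they have to accomplish.

DEPTH = 8  # AGA: 8 bitplanes, 256 colours

def c2p_8pixels(chunky):
    """Convert 8 chunky pixels into one byte per bitplane."""
    assert len(chunky) == 8
    out = [0] * DEPTH
    for i, pix in enumerate(chunky):       # pixel 0 ends up in the MSB
        for p in range(DEPTH):
            if pix & (1 << p):
                out[p] |= 0x80 >> i        # bit p of pixel i -> plane p
    return out
```

Doing this transposition for every frame is pure overhead that chunky hardware simply doesn't have, which is why fast CPUs made Akiko's version redundant.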


Which brings me to my pet peeve: the already slow 68020 (68EC020) at 14 MHz was crippled because, even though it had a 32-bit bus, it was connected only to a 16-bit RAM bus (chip RAM).

This 16-bit memory (2 megs) is also where the framebuffer and audio live, so the stock CPU in the A1200 has to share bandwidth with display signal generation and with the graphics and audio processing.

All in all, it meant the Amiga 1200 had only about twice the memory throughput of the Amiga 500 (about 10 megabytes/s vs. about 5 megabytes/s).

If the A1200 had shipped with at least some 32-bit memory (it existed as a third-party add-on), the CPU could have had its own uncontested RAM with a throughput of about 20-40 megabytes/s.
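For concreteness, the comparison works out like this (a back-of-the-envelope sketch using only the rough figures quoted in this comment, not measured values):

```python
# Rough throughput figures as quoted above (MB/s); none are measurements.
a500_mb_s     = 5          # 16-bit chip RAM, shared with the OCS
a1200_mb_s    = 10         # only ~2x: faster CPU, still 16-bit shared chip RAM
fast_ram_mb_s = (20, 40)   # hypothetical 32-bit Fast RAM, uncontested by the chipset

speedup_stock = a1200_mb_s / a500_mb_s          # A1200 vs A500: 2x
speedup_fast  = fast_ram_mb_s[0] / a1200_mb_s   # Fast RAM vs stock: 2-4x again
print(speedup_stock, speedup_fast)
```

In other words, a small Fast RAM expansion bought at least as big a jump over the stock A1200 as the A1200 itself was over a seven-year-older A500.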

Imagine the difference it would have made if the machine had just a little extra memory.

That's just a tiny detail. That the chipset wasn't 32-bit was another disappointment.

The bigger problem was that Commodore as a company was aimless.


Yeah, and it took ~7 years to make those marginal improvements over the earlier Amiga chipset! I'm ignoring ECS, since it barely added anything over OCS for the average user.


Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"


IIRC, the interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidth available at the time.

Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.

By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were a downside/sin that should've been fixed to stay relevant.


5 sins in 1992:

- 8-bit planar instead of chunky
- progressive display (vs interlaced)
- sound was not 16-bit
- should have been a 68030 with MMU support (vs the 68EC020)
- HD mandatory

If they had addressed these, the Doom experience would have run better on the Amiga.


> The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…


Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade/re-invent x86, which until x64 still retained so much of the x86 instruction encoding/heritage (heck, even x64 retains some of the instruction encoding characteristics).

Had the Amiga retained relevance for longer, and without a push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA-based Apollo 68080 would've matched late-1990s Pentium IIs, and FPGAs aren't speed monsters to begin with.


The 68060 is pretty good to be fair, but it never ended up being widely used and Motorola definitely saw PPC as the future.

Maybe if these theoretical new 68k Amigas had become a huge market hit, they could have taken the arch further and it could have remained competitive. But all the other 68k shops had pretty much given up or moved on already (Apple was already going PPC, Sun went SPARC, NeXT gave up on their 68k hardware, Atari was exiting the computer business entirely, etc.), so I don't know that the market would have been there to support development against the vast amount of competition from the huge x86 bastion on one hand and the multitude of RISC newcomers on the other.


Right, and I think that is a junction point. Had Motorola, as a chip company, not been enamoured with the new shiny, and instead realized that they already had a huge market that just wanted improved performance for its existing software, they could have pushed 68k improvements instead of a new PPC architecture; both Apple and (a better-managed) Commodore could've been competitive with improved 68k designs.

Remember, Intel also barked up the wrong tree with Itanium for 64-bit and didn't really let go until AMD forced their hand with x64.


The argument is that 68k is "CISCier" than x86 (the addressing modes in particular), so making a performant modern out-of-order superscalar core for it would be harder than for x86.


I can believe that. But Commodore could have plunked a cheap 68020 into their machines for backwards compatibility (like how the MSX2 had an MSX1 SoC inside, the PS2 had a PS1 SoC, the PS3 had a PS2 SoC, and so on) and put another "real" socketed CPU in as a co-processor. Or made big-box machines with CPUs on PCI cards, for infinite expansion options. "True" multitasking, perfect for CAD, 3D rendering and non-linear video editing. It would have been very cool to have an architecture where the UI could be rendered with almost hard realtime guarantees while the heavy processing happened elsewhere.


This is almost exactly what the plan was, until C= went out of business:

https://en.wikipedia.org/wiki/Amiga_Hombre_chipset

It was going to be HP PA-RISC based and have an AGA Amiga SoC, including a 68k core.


How much of Hombre is myth and legend? Given how little progress was made with OCS -> ECS -> AGA, it seems unlikely they could even have built an Amiga SoC, never mind designed a new 64-bit chipset.



I don't agree there, considering x86 has ModRM, size prefixes (16/32-bit and later 64-bit operand sizes), SIB (with a prefix for 32-bit), segment/selector prefixes, etc.

The biggest difference, where the 68000 is perhaps more complicated, is postincrement addressing. But considering all the cruft 32-bit x86 already inherited from the 8086, compared to the "clean" 32-bit variations of the 68000, I'd call it a toss-up at best, leaning toward the 68000 being easier (stuff like PC-relative addressing also exists on the RISC-y ARM arch).

Apart from addressing modes, the sheer number of weird x86 instructions and prefixes has always been the bane of low-power x86.


There were no tech problems IMHO; it was all management problems. They could have chosen any of a handful of completely different (edit: even mutually exclusive!) tech paths and still have won, but instead they chose to do almost nothing except bleed the company dry.

Edit: I don't mean that their success was certain if they executed better. I mean they did almost nothing and got the guaranteed outcome: failure. (And their engineers were brilliant but had very little resources to work with.)




