I used to work for a disk duplication company (DisCopyLabs) and remember the various pizza parties we would have because someone from Borland was bringing over new Masters to replicate, meaning we had to degauss all of the current inventory of discs and recopy, late into the night.
That was a trippy job; Borland supported the CP/M platform as well as MS-DOS and PC-DOS. That is where I got much of my experience on a wide range of computers (DEC Rainbow, Eagle, Apricot, Heath, Apollo, Northstar, etc.).
And 8" floppy discs made the best frisbees... If we were really good at it, we could get them all the way over onto San Thomas Expressway from Wyatt Drive. Quite a few up on the roof, as I recall.
I worked in my father’s camera store/photo lab for many years. A photo lab typically has about 3 billion of those plastic canisters film used to come in[1]. During slow hours, we’d have wars flinging the plastic lids, which you could send a good 100 feet by holding them between your index finger and thumb and snapping your fingers.
By envelope I take it you mean the jacket the discs are in (the part you put into the drive) and not the paper protective sleeve you put the discs in when not in use.
But yes, the discs as you would put into the computer made great frisbees; if you took them out of the protective jacket, they lacked the rigidity (at least the ones I played with). I never did a head-to-head test with an 8" and a 5.25" disc in the open, but from vague recollection, the 8" discs could make about twice the distance.
My personal favorite was punched cards made into simple paper planes with a bent-over nose so they could be catapulted via elastic bands. Add a paper clip or two and they traveled very far and with great force: with two paper clips at 3 meters, you could go through the side of a coke can. Never made one that made it all the way through, and you also needed somebody prepared to hold the coke can (though we had no accidents, luckily). But it was pretty dangerous. I do recall a nice cut above the eye from a 5.25" frisbee in the office, as the corners of discs at speed are not conducive towards human flesh.
These days, none of these things happen as much as they used to, on the grounds of HR and health and safety being more vigilant and prominent. Though 3.5" floppy discs proved safe, as any impact upon a hardish surface often saw the disc come off worse than the target.
I remember this issue after upgrading to a ~350MHz machine way back when.
A quick fix, with no software or patching needed, was to hit the return key and then the pause key at almost the same time after typing "program.exe". About 25% of the time, the pause would trigger an interrupt right in the delay calibration loop, and then pressing any key would resume the calibration loop with enough RTC time passing to avoid the division by zero.
Speaking of Turbo Pascal: it was initially created by Anders Hejlsberg, who was later heavily involved in creating Delphi, C#, and TypeScript. Of course these languages were created by teams, but if you want to name a single creator, it would be Anders Hejlsberg.
Before working for Borland he ran his own company, where he created PolyPascal, which was essentially the product Borland acquired and turned into Turbo Pascal.
PolyPascal allowed me to write software in what felt like a proper programming language. I had previously learned to code using Basic and Comal 80 but they felt like toy languages compared to Pascal.
It was "Compas Pascal" before it was PolyPascal... I might still have an original Compas 5.25" floppy somewhere, though even if I do, I have no drive to read it with (and have serious doubts that, 30 years later, it would still read even if I did).
I've been on a vintage computer kick and dusted off a 30-year-old PC clone. Unless the disks are physically bent, there's a good chance they'll work; a lot of my old ones did.
Somehow I managed to keep most of the code I wrote in TP7 as a kid (mid-nineties; I wasn't very up to date). It's stored safely in cloud backup today, having survived on a single 3.5" disk for many years prior to that.
Sometimes I like to fire up some of the stuff I wrote back then or even just look over the code. Brings back memories of the thrill of getting some of those ideas working for the first time.
Good for you! I've lost most of my "childhood coding" stuff to an accidental `rm -rf /home /something` (note the extra space. Backups? what backups?) some ~18 years ago. That included a couple of DOS viruses, one of which was polymorphic (I wrote them just for fun and never spread them; did upload one to an antivirus vendor BBS back then through their "Submit a new virus" function and it's still in many databases because of that); a C++ rewrite of mars.exe voxel landscape generator, but with added rotation, music and lakes; a PWM-based WAV player for PC speaker; etc. I'm still quite sad about that accident ...
I've definitely kept all my stuff from those days. I even once decided to throw some of it (with little organization beyond a few projects) on my public website:
What's actually more amazing is thinking back to the time distortion of being a kid. Stuff that really only took place over the course of a year or two feels like 5+ years in terms of my perceptual memory of the era.
I’d love to see this! Imagine if we could see childhood pictures from famous artists. Not suggesting your code is famous (maybe it is), but still, I’m sure you’ve grown a lot since then.
I've thought it would be neat to put it on GitHub by executing a series of commits that have the dates faked according to the original file creation dates.
Probably the most complete project I did was a resource manager for Quake - it allowed you to export and import textures, sounds, vertex defs, etc.
Whether this should be public is another question altogether. Some of my code comments and readme files make me cringe today!
> Turbo Pascal programs start by calibrating a delay loop (so that the Delay function knows how much to spin to achieve a certain delay). The calibration counts the number of times a certain loop is run for 55ms (as measurable using the PC’s timer interrupt with its default setting), then divides the number of loops by 55 so that Delay can then busy-wait in millisecond increments.
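A minimal C sketch of the calibration scheme described in that quote (not Borland's actual code; the iteration count is a stand-in for the measured value):

```c
#include <stdint.h>

/* Count busy-loop iterations during one 55 ms timer tick, then derive
 * loops-per-millisecond for Delay's busy-wait.
 * TP7 keeps a 32-bit count, but the quotient is squeezed into 16 bits;
 * on x86 the DIV instruction faults when the quotient does not fit,
 * which surfaces as "Runtime error 200". */
uint16_t calibrate(uint32_t iterations_per_55ms) {
    uint32_t loops_per_ms = iterations_per_55ms / 55;
    return (uint16_t)loops_per_ms; /* truncates in C; faults in the real asm */
}
```

On a 4.77 MHz PC the count is tiny and this works fine; the trouble only starts once the quotient no longer fits in 16 bits.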
Why not use the PC's timer interrupt for the delay function itself?
MS-DOS compatible != IBM PC compatible. An 80186-based PC will still run TP programs despite Timer0 being mapped to ports 30H-36H instead of 40H-43H, not to mention not being 8253-compatible at all.
The slow timer actually works in your favor if you do it right. You don't count interrupts, you count CPU cycles between interrupts (search term: "reciprocal counter.")
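A sketch of that reciprocal idea in C (the counter values are illustrative stand-ins for real hardware reads):

```c
#include <stdint.h>

/* Instead of counting slow 55 ms interrupts over a fixed window, sample
 * a fast free-running counter at two consecutive tick edges and divide.
 * Unsigned subtraction handles counter wraparound for free. */
uint32_t fast_counts_per_ms(uint32_t count_at_tick,
                            uint32_t count_at_next_tick,
                            uint32_t tick_ms) {
    uint32_t elapsed = count_at_next_tick - count_at_tick;
    return elapsed / tick_ms;
}
```

The divisor is the known-good tick period rather than a measured loop count, so a faster CPU just produces a larger (32-bit) numerator instead of an overflowing 16-bit quotient.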
This bug was just laziness on Borland's part; there were no excuses for it. There were any number of ways to do it correctly. Fortunately they fixed it pretty quickly, too. While I remember running into it, it wasn't a huge issue.
No excuse? It may have been a left-over from early versions.
The original Turbo Pascal ran on PCs of the day, running at 4.77MHz, and with 64 kilobytes of RAM.
Even if one foresaw computers running at 200MHz, I don’t think that, at the time, it was clear that a single architecture would stay around long enough for that to be a problem. The Pentium Pro peaked at 200MHz in 1995-8 or so, 12 to 15 years after the first release of Turbo Pascal (https://en.wikipedia.org/wiki/Pentium_Pro)
> It may have been a left-over from early versions.
Earlier versions do not have this particular bug; it was introduced in version 7. However, they have another bug which causes Delay to simply not work properly (it finishes quicker than requested), and most likely the changes in version 7 were made in an attempt to fix that one.
I think it's the very same bug: they just bumped the counter from 16 to 32 bits but left the rest of the logic intact, including the 16 bits for the result of the division.
Reading https://retrocomputing.stackexchange.com/a/12112, they moved the limit from ≈20MHz (the number of iterations overflowed 16 bits, leading to incorrect computation of the number of loops to run per second) to ≈200MHz (the number of iterations divided by 55 overflowed 16 bits, leading to this “divide by zero” message).
(Aside: all things staying the same, that should have increased the limit by a factor of 55. It seems the processor got more efficient at running these idle loops by a factor of about five (about 2½ if there’s a sign bit in the result of the division).)
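For concreteness, here are the two overflow thresholds implied above, expressed in busy-loop iterations per second (my own arithmetic, not code from the linked answer):

```c
#include <stdint.h>

/* Pre-TP7: the raw count over one 55 ms tick had to fit in 16 bits. */
uint32_t pre_tp7_limit(void) { return 65535u * 1000u / 55u; } /* ~1.19M iters/s */

/* TP7: the count itself is 32-bit, but count/55 (loops per ms) must fit
 * the 16-bit result of an x86 DIV, i.e. at most 65535 loops per ms. */
uint32_t tp7_limit(void) { return 65535u * 1000u; } /* ~65.5M iters/s */
```

Dividing the two gives exactly the factor of 55 from the aside; the observed jump from ≈20MHz to only ≈200MHz is smaller because the faster CPUs also ran each loop iteration in fewer clock cycles.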
So yes, you could call this a different bug. I don’t think having that bug in the code is inexcusable, though.
⇒ I don’t think they could have found this in testing. So, is it inexcusable that this went through code review? I don’t think so. Not only would one have had to realize that the new limit now came into play, one would also have had to realize that the higher efficiency of the 200MHz processor(s) would bring the number down further.
Turbo Pascal is old enough that (I think) they probably wanted to keep compatibility with machines that ran MS-DOS but weren't IBM PC-compatible. If that's the case, messing with the timer interrupt would've been a bad idea.
Windows 95 and 98 (but not 98SE) had a similar problem with fast CPUs, also due to an internal delay loop being run too quickly. However, the same problem is not present in earlier versions of Windows (3.x and below) nor DOS, which is interesting because the kernel of 3.1/3.11 is almost identical to Win95's.
I recently wrote a personal productivity app using Laz & FP, and came away really impressed. I immediately installed the tools on three different work systems and joined the community because I can see this being valuable in the future.
As another result of that experiment, I thought about giving Turbo Pascal a try next time I'm in DOSBox, just for nostalgia reasons. Hopefully the issue highlighted here wouldn't be too awful to deal with.
Similar issues (calculating somethings-per-second when the test 'something' started executing faster than expected) caused a good number of games to stop working as CPU speeds increased.
Same for GPUs, and it actually still happens with many games that use delta timing (meaning they calculate the time delta between the last frame and the current frame and use it to update the game world state). For example, in Condemned you often get stuck for a few frames on random junk/pebbles/etc. on the floor (and the game has a lot of that) if the game runs too fast. Other games have issues like collisions not registering, jump distances being too long/short, etc.
Because of that I always run any games that are more than 5-6 years old with a frame-capping tool.
That's precisely what it was for! In the era just before said games which would attempt to calibrate their delay loops against the realtime clock, games were only expected to run on a single speed of CPU and used hard coded delay loops to achieve their desired speeds. When someone got a new computer that ran twice as fast, the game ran twice as fast too. So as a stopgap, they gave you a button which underclocked your fancy new 386SX from 20MHz to 12MHz or whatever.
(In looking up the clock speeds since it's been that long, I also learned that some 386 chips had bodgy 32-bit multiply logic and were sold as "16 bit", and due to the weird way humans assign value, apparently these are now valuable collectibles! Source: https://en.wikipedia.org/wiki/Intel_80386#Early_problems)
And the turbo button created its own support calls back in the day: "My computer suddenly seems very slow." "Try pressing the Turbo button." "Thanks, that fixed it!"
Sometimes you also needed to switch the L1 and/or L2 cache off to get the correct speed, in addition to pressing the turbo button.
If you were unlucky, underclocking was the only option to get the speed just right. Not fun in the era when clock speed was set with mainboard jumpers...
Of course there were also TSR [0] utilities to slow down old games.
I remember twiddling with all that to get some early XT-era games working properly. Computers were getting faster so quickly that you'd literally see a new, faster model for sale as little as 1-2 months later. Slowing things down just the right amount was an art form in the mid to late nineties.
I remember binary-patching my old programs to fix the issue. And I couldn’t recompile the code, because Turbo C++ crashed with this problem as well.
Lol, because it was a fucking complete IDE and compiler on a single low-density floppy disk.
And what you’re saying is of course impossible: determining whether the function is called by static analysis is not possible.
If input = 1 then delay end
is impossible to analyze statically.
If you meant a reference to delay, that is a bit more tractable. But then, when do you actually initialize? You will see that program initialization is about the only practical place to do it. Which is exactly what is done.
Yes, I meant referenced in the code, not necessarily called during execution. It makes sense to do it in the beginning if delay is referenced. I guess it was just a small optimization that they decided to do without then.
Nowadays, with modern linkers, your delay function could have a strong reference to a delay-init constructor, which would get dragged in as needed and run at program startup time.
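With GCC or Clang, for instance, that pattern looks roughly like this (a sketch; the calibration value is faked and the names are made up):

```c
#include <stdint.h>

static uint32_t loops_per_ms; /* set before main() ever runs */

/* GCC/Clang extension: functions marked 'constructor' run at program
 * startup. Placing delay() and delay_init() in the same object file
 * means any reference to delay() drags the initializer in at link
 * time; if delay() is never referenced, neither is linked at all. */
__attribute__((constructor))
static void delay_init(void) {
    loops_per_ms = 1000; /* stand-in for a real timing calibration */
}

void delay(uint32_t ms) {
    for (volatile uint32_t i = 0; i < ms * loops_per_ms; i++)
        ; /* busy-wait, as in Turbo Pascal's Delay */
}
```

So the calibration still happens at startup, as in Turbo Pascal, but only gets pulled into programs that actually use delay().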
Thanks for the memories...