
I suspect the Roman numeral system played a bigger role in retarding Roman mathematics than the Reddit postings suggest.

After all, it's fairly well known that the characteristics of a particular programming language have a strong influence on the way the language is used. (For instance, people don't do OOP or FP using C.)



Roman numerals are profoundly misunderstood by most people today, whose main knowledge of them is that various authority figures use them as an example of an awkward and cumbersome precursor to Hindu–Arabic numerals, backed up by a small amount of personal experience learning to encode/decode Roman numerals, which feels difficult only because nobody today has any substantial practice at it.

Roman numerals are a recording tool, not a calculating tool. Romans did calculations using pebbles or other counters on a counting board. Roman numerals are just a way of recording the state of a counting board before/after performing some algorithm. Their goal is to record the counting board's state as directly and faithfully as possible.
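To make that distinction concrete, here's a toy sketch in C (a modern illustration of the idea, not a claim about any actual Roman board layout, which also used five-unit sub-columns): pebbles sit in columns by place value, calculating means dropping pebbles and carrying, and the numeral is nothing more than a readout of the final board state.

```c
#include <stdio.h>

#define PLACES 6   /* columns for 10^0 .. 10^5 */

/* Add a number to the board by dropping pebbles into each column,
   then normalize: ten pebbles in one column become one pebble in
   the next column over (the "carry"). */
void board_add(int counters[PLACES], long n) {
    for (int i = 0; i < PLACES; i++) {
        counters[i] += n % 10;
        n /= 10;
    }
    for (int i = 0; i + 1 < PLACES; i++) {
        counters[i + 1] += counters[i] / 10;
        counters[i] %= 10;
    }
}

/* The numeral is just a record of the board: read the columns out. */
long board_read(const int counters[PLACES]) {
    long n = 0;
    for (int i = PLACES - 1; i >= 0; i--)
        n = n * 10 + counters[i];
    return n;
}

int main(void) {
    int board[PLACES] = {0};
    board_add(board, 109);              /* CIX   */
    board_add(board, 32);               /* XXXII */
    printf("%ld\n", board_read(board)); /* 141, recorded as CXLI */
    return 0;
}
```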


> Romans did calculations using pebbles or other counters on a counting board.

I think you just made my point :-)

Have fun doing long division that way.


You can do long division just fine on a counting board, though it is unclear if people had developed something like the modern elementary school division algorithm 2000+ years ago.

We don’t really know much about people’s calculation algorithms, because they were an oral tradition not written down, and only a very small number of counting boards (e.g. made of marble) survived; others were presumably made of leather, wood, cloth, lines scratched in the dirt, ....

Japanese soroban experts handily beat westerners at doing division, in both speed and accuracy. There is no reason to believe that calculation experts of the ancient world would not have been comparably competent.


> You can do long division just fine on a counting board

I searched around a bit, and didn't find much of anything you could do with a counting board beyond addition and subtraction.


Counting boards / the abacus are about as good as calculating methods got until mechanical calculators and logarithmic slide rules. So it's not really fair to say the Romans had a disadvantage here when no one had anything better. (Granted, the wire abacus was faster than counting boards / jetons.)


Wouldn’t a slide rule be a similar modern-day approach?


An (analog) slide rule is a whole lot faster than doing digital arithmetic. But it’s approximate, yielding about 3 digits of precision, or maybe 4 digits on a large slide rule.

With an abacus or with written numbers, you can get 10 digits of precision (or 50) if you are willing to put the work in.


You can iterate slide rule calculations to get more precision, too... You're not restricted to the width of your abacus or the number of sigfigs you can get in a single slide rule calc.


That's genuinely interesting, but doesn't really change that it's a terrible system for abstract calculations.


This is not the first time I've seen you writing about this on HN.

Do you have a good source to read more about this? Both about the distinction you're making (recording vs. calculation) and in general about historical capabilities around them? I'm very interested in learning more!


The Romans also used the abacus for numeric calculations: https://en.wikipedia.org/wiki/Roman_abacus


Yes, and in fact it was a seminal discovery that one can calculate (‘calculus’ means ‘pebble’) by manipulating symbols.


I'm not convinced. The filesystem layer on Linux and FreeBSD (and probably other OSs too, though I can't speak to them) is totally object-oriented and written in C.

Gnome/GTK also encourage an object-oriented style in C, via GLib and GObject.

Also, Greek numerals were just as unwieldy as Roman ones (if not worse), and at any rate the Greeks did not use numbers in their mathematics. There's an enormous amount of math you can do without arithmetic and algebra.

(Edit: Saw your username after I posted this and I want to say that I respect you deeply -- I don't know of anyone else who has written a working C++ compiler almost by themselves. And I like D a lot, though have only had a very limited chance to use it professionally.)


I've seen OOP code written in Fortran-10. It blew my mind at the time (I didn't even know it was called OOP until years later).

But only once. And probably by someone who had learned OOP in another language that was built around OOP.

I was in the C business before, during, and after the OOP revolution. I never saw any OOP attempts in C before C++ came along and popularized OOP. Many C programmers didn't want to move to C++, and were determined to make OOP work in C. It did work, but the result was kinda hideous and terribly fragile (type safety went out the window with all the necessary casting).
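For anyone who hasn't seen that style, here's a minimal sketch (my reconstruction, not any particular codebase) of hand-rolled OOP in C, with a function-pointer slot standing in for a vtable. The casts are exactly where the type safety goes out the window:

```c
#include <stdio.h>
#include <stdlib.h>

typedef struct Shape Shape;
struct Shape {
    double (*area)(Shape *self);  /* one-slot "vtable" */
};

typedef struct {
    Shape base;                   /* base subobject must come first */
    double w, h;
} Rect;

static double rect_area(Shape *self) {
    Rect *r = (Rect *)self;       /* unchecked downcast: correct only
                                     if self really is a Rect */
    return r->w * r->h;
}

Shape *rect_new(double w, double h) {
    Rect *r = malloc(sizeof *r);
    r->base.area = rect_area;
    r->w = w;
    r->h = h;
    return (Shape *)r;            /* upcast, also via cast */
}

int main(void) {
    Shape *s = rect_new(3.0, 4.0);
    printf("%g\n", s->area(s));   /* "virtual" dispatch: prints 12 */
    free(s);
    return 0;
}
```

Nothing stops you from passing some unrelated struct through that Shape pointer; the compiler can't help you, which is the fragility in question.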

> I don't know of anyone else who has written a working C++ compiler almost by themselves

Neither do I :-) Thanks for the kind words.


O-O is a state of mind and can be implemented in nearly all algorithmic languages; you just need senior people who know WTF they are doing. And I've seen a lot of spaghetti non-O-O code in places like the Google web crawler: even though there are class definitions, there are no actual objects.


> For instance, people don't do OOP or FP using C.

Oh, but they do. It’s just that their syntax is awful.


> It’s just that their syntax is awful.

I want to expand a bit on that. In the evolution of D, I have a front row seat on how people use it. It's hard to overstate how relatively minor syntactic changes can have a heavy influence on the programming paradigms people select. It's startling.

People will often say "but I can do that, too, in my favorite language X", and they are correct. But they don't actually do it in X because it's inconvenient.

For example, D has a built-in syntax for unittests. That gets pooh-poohed a lot as being pointless. But it's hard to argue with how transformational that has been for D programs. People expect unittests when writing D code. They didn't before. Unittests often occupy more lines of code than what they tested. The addition of a very minor bit of syntactic sugar changed the whole way people write D code.
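For contrast with D's built-in `unittest { ... }` blocks (compiled in with `dmd -unittest`), here is the kind of standalone harness a C programmer has to arrange by hand. A hypothetical minimal example, but it shows the friction the sugar removes: a separate program, a separate build target, its own main().

```c
#include <assert.h>
#include <stdio.h>

/* The function under test. */
static int clamp(int x, int lo, int hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

/* No built-in test syntax, so the tests live in their own program. */
int main(void) {
    assert(clamp(5, 0, 10) == 5);
    assert(clamp(-1, 0, 10) == 0);
    assert(clamp(99, 0, 10) == 10);
    printf("all tests passed\n");
    return 0;
}
```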


Yes, I've seen OOP etc. done in C. As you say, it's awful, and so people don't do it. People devise their algorithms and data structures in ways that work smoothly with the language's features.

I'm sure you can compute sine and cosine tables using Roman numerals, too. But it would be so awfully ugly and tedious it's hardly a surprise that few would consider attempting it.


> I'm sure you can compute sine and cosine tables using Roman numerals, too. But it would be so awfully ugly and tedious it's hardly a surprise that few would consider attempting it.

Ptolemy’s table of chords was calculated in base 60 (inherited from the Mesopotamians), probably by a Roman citizen living in Egypt and writing in Greek.

It was probably done on some kind of a counting board analogous to the ones used for decimal calculations. Hellenistic mathematicians didn’t do written arithmetic either.

And yes, making such tables is inherently “awfully ugly and tedious”, unless you have an electronic calculator to do it for you.


Yeah, I've seen that table. It was just a few entries, if I recall correctly. Few enough that one could have gotten the numbers by using drawings instead of calculation. And there were errors in it, too.

> And yes, making such tables is inherently “awfully ugly and tedious”,

Yup, but people did make such extensive tables long before calculators, many thousands more entries than that chord table, and far more accurately.


Ptolemy’s table has the chords for every possible angle in ½° increments (360 table entries in all), to 3 sexagesimal digits of precision, or about 5.5 decimal digits of precision. It also had an additional column showing the derivative of the chord function at each ½° at 5 sexagesimal digits of precision, for use interpolating at arbitrary angles in between the listed ½° increments. https://en.wikipedia.org/wiki/Ptolemy%27s_table_of_chords
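If you want a feel for how much precision that is, here's a quick C sketch (mine, using the modern sine function) that recomputes one entry and formats it in base 60. For a circle of diameter 120, crd(θ) = 120·sin(θ/2); at θ = 36° this should come out to 37;4,55, matching the table's listed value.

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    const double PI = 3.14159265358979323846;
    double theta = 36.0;  /* degrees */

    /* Ptolemy's convention: circle of diameter 120, so
       crd(theta) = 120 * sin(theta / 2). */
    double crd = 120.0 * sin((theta / 2.0) * PI / 180.0);

    int whole = (int)crd;               /* integer part          */
    int d1 = (int)(crd * 60) % 60;      /* 1st sexagesimal digit */
    int d2 = (int)(crd * 3600) % 60;    /* 2nd sexagesimal digit */

    printf("crd(%g deg) = %d;%d,%d\n", theta, whole, d1, d2);
    /* prints: crd(36 deg) = 37;4,55 */
    return 0;
}
```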

You might be thinking of Hipparchus’s table (from a few centuries earlier) which only had multiples of 7½°.


> You might be thinking of Hipparchus’s table

Sounds like it.


This is what I've heard. Planet Money has a good episode on how the invention of modern bookkeeping was significantly influenced by the adoption of Arabic numerals during the Renaissance.


As someone else pointed out, this does nothing to explain why Roman-era mathematics seems to have stagnated, compared to the Classical and Hellenistic Greek mathematics that preceded it. Greek mathematicians didn't have access to Indian/Arabic numbers either. (And it mistakes "using numbers" for "mathematics" -- much of Greek mathematics was focused on geometry.)


Romans valued pragmatic skills over theoretical discussions. They clearly knew their math and the numbering system was not an issue for their engineers. They just didn’t care for philosophizing about the nature of geometry like the Greeks did. They preferred someone inventing an odometer or a way to detect tax fraud.


How bizarre. I was wondering if you and I had read the same post here; turns out we hadn't.

The original answer that I read yesterday has since been removed by the mods for some reason.

---

[-] ImOuttaThyme 66 points 16 hours ago

[Their number system.](http://storyofmathematics.com/roman.html)

Mathematician in training here, with a side hobby of history. Sources are included throughout.

You may recall from elementary school or your own perusing of Roman history books that the Romans had a numeral system that used letters from the Latin alphabet instead of dedicated digit symbols. I is 1, C is 100, and so on.

The Romans did decently with their numeral system. They could add, subtract, multiply and divide with their numbers. However, it was missing two very important principles, if that's the right word, that today's Arabic numbers have.

The first is the idea of zero in mathematics. They knew the concept of nothing, yes, but they had no numeral for the number "zero." So essentially, try doing your own math homework without using the number zero in the one's place, the ten's place, and so on. Now, zero itself didn't originate with Arabic numbers, but it did come from an ancestor of them. [Origins of Zero](https://www.smithsonianmag.com/history/origin-number-zero-18...)

Which led to the second problem with Roman numerals: their numbers do not work on a positional system. Today's Arabic numbers, 0-9, work on a positional system. That is, we have the one's place; go left one digit, the ten's place; left again, the hundred's place; and so on. Each digit in a number is a coefficient of the base raised to a certain power. This is what makes addition, subtraction, multiplication and division so easy to do with Arabic numbers, especially with the concept of zero, a placeholder to put in a certain space when a number has no "tens" or "hundreds" or "ones". Like 109: it has 9 ones and 1 hundred, but no tens. Without zero, we wouldn't be able to write it out like this.
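The place-value idea in one line of arithmetic: 109 = 1×100 + 0×10 + 9×1, with the zero holding the empty tens place open. A trivial C loop (my illustration) makes the decomposition explicit:

```c
#include <stdio.h>

int main(void) {
    int n = 109;
    /* Peel off the coefficient of each power of ten. */
    for (int place = 100; place >= 1; place /= 10)
        printf("%d x %d%s", (n / place) % 10, place,
               place > 1 ? " + " : "\n");
    /* prints: 1 x 100 + 0 x 10 + 9 x 1 */
    return 0;
}
```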

A little sidenote here: technically, the Romans did have a sort of positional system, in that the greater numerals went on the left and the lesser ones on the right, except when two numerals were paired to communicate a subtracted number. So while it's not an explicit positional system like today's, in which each digit's place determines its value, they did order their numerals by size.

Now, in fairness, the Romans simplified their numeral system a little over the years by adding two principles: subtraction and multiplication. Whenever they placed a smaller numeral before a bigger one, the pair communicated a new number entirely, the smaller subtracted from the bigger. So, for example, in 109, CIX, the IX is worth 9 because X (10) minus I (1) is IX (9).

As for multiplication: to indicate thousands, they would put a line over a symbol, which was the same as multiplying it by 1,000.
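Both conventions fit in a few lines of code. Here's a minimal C decoder of my own (the overbar multiplication is omitted, since it doesn't survive ASCII): scan left to right, and each symbol adds its value unless it sits before a bigger one, in which case it subtracts.

```c
#include <stdio.h>

/* Value of a single Roman symbol. */
int roman_value(char c) {
    switch (c) {
        case 'I': return 1;    case 'V': return 5;
        case 'X': return 10;   case 'L': return 50;
        case 'C': return 100;  case 'D': return 500;
        case 'M': return 1000; default:  return 0;
    }
}

/* Subtractive rule: a symbol smaller than its right-hand neighbor
   counts negative, e.g. the I in IX contributes -1. */
int roman_to_int(const char *s) {
    int total = 0;
    for (int i = 0; s[i]; i++) {
        int v = roman_value(s[i]);
        int next = s[i + 1] ? roman_value(s[i + 1]) : 0;
        total += (v < next) ? -v : v;
    }
    return total;
}

int main(void) {
    printf("CIX   = %d\n", roman_to_int("CIX"));    /* 109 */
    printf("XXXII = %d\n", roman_to_int("XXXII"));  /* 32  */
    return 0;
}
```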

Let's do a little Roman math.

For Roman numerals, 109 would be written out as CIX. Great, we get the amount communicated. Let's say we want to add 32 to this number in Roman numerals. 32 in Roman is XXXII.

CIX + XXXII

Immediately, there's a problem. We have to separate the various numbers. So CIX becomes C IX and XXXII becomes XXX II. Now we can add them together by going "Okay, IX plus I makes X, there's an I remaining, X joins the three other Xs so we now get CXXXXI." aka 141 in today's numbers.

Let's imagine as if we did that with our numbers. Let's add 109 and 32 again. Only, we're adding 100 and 9 and 30 and 2. We know 9 + 2 makes 10 and 1. We know 10 and 30 makes 40. We have 1 and 100 remaining, there's nothing else to add them together so they stay like that. The resulting number becomes 100 and 40 and 1. Exhausting.
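That regroup-and-carry procedure can be written down directly. A C sketch of it (mine): tally the symbols of both numerals, expanding subtractive pairs the way the walkthrough splits off the IX, then let five I's become a V, two V's an X, and so on. It produces the additive form CXXXXI before any subtractive shorthand is reapplied.

```c
#include <stdio.h>

static const char SYM[] = {'I','V','X','L','C','D','M'};
static const int  VAL[] = { 1,  5, 10, 50,100,500,1000};

static int idx(char c) {
    for (int i = 0; i < 7; i++) if (SYM[i] == c) return i;
    return -1;
}

/* Count each symbol; a subtractive pair like IX counts as one X
   and minus one I, so a tally can go briefly negative. */
static void tally(const char *s, int count[7]) {
    for (int i = 0; s[i]; i++) {
        int a = idx(s[i]);
        int b = s[i + 1] ? idx(s[i + 1]) : -1;
        count[a] += (b > a) ? -1 : 1;
    }
}

int main(void) {
    int count[7] = {0};
    tally("CIX", count);    /* 109 */
    tally("XXXII", count);  /*  32 */

    /* Regroup: five I's make a V, two V's make an X, and so on;
       borrow first if a subtractive pair left a negative tally. */
    for (int i = 0; i < 6; i++) {
        int group = VAL[i + 1] / VAL[i];   /* 5 or 2 */
        while (count[i] < 0) {
            count[i + 1]--;
            count[i] += group;
        }
        count[i + 1] += count[i] / group;
        count[i] %= group;
    }

    /* Read the result off, largest symbol first. */
    for (int i = 6; i >= 0; i--)
        for (int j = 0; j < count[i]; j++) putchar(SYM[i]);
    putchar('\n');                          /* prints CXXXXI (141) */
    return 0;
}
```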

So, addition is possible. Subtraction is also possible. You have to go through the entire grouping of bases to do it, but it is possible. Multiplication is somewhat possible but very iffy, having to do all that grouping of bases manually.

Division. Division was the hell of the Roman numeral system. The Romans did not have decimals. They had fractions, but they did them in duodecimal form, that is, twelfths. They did not have a 1/10th; they preferred to do everything in twelfths. Now, this makes some sense: 12 has many factors compared to 10. You can divide 12 by 1, 2, 3, 4, 6 and 12 itself, whereas you can only divide 10 by 1, 2, 5, and 10. There are plenty of people who believe we should move to a duodecimal system. Then again, the French attempted to install a decimal system for time during the French Revolution; maybe not all such ideas are popular.

Either way, Romans did not have very explicit fractions. They had the base fractions 1/12 through 12/12, then continued on from there by dividing them further, like 1/144 (1/12 * 1/12) or 1/8 (3/12 * 6/12). More information on Roman fractions can be found [here](http://dmaher.org/Publications/romanarithmetic.pdf). Another side fact about their fractions: they always named them as fractions of something, such as 1/12 of an as, which was a unit of currency, never 1/12 by itself.
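A tiny sketch just to check the 3/12 * 6/12 arithmetic above (the representation is mine, not a Roman notation):

```c
#include <stdio.h>

static int gcd(int a, int b) { return b ? gcd(b, a % b) : a; }

int main(void) {
    /* Roman-style fractions as counts of unciae (twelfths);
       a product of two of them lands in 144ths. */
    int a = 3, b = 6;                  /* 3/12 and 6/12 */
    int num = a * b, den = 12 * 12;    /* 18/144 */
    int g = gcd(num, den);
    printf("%d/12 * %d/12 = %d/%d = %d/%d\n",
           a, b, num, den, num / g, den / g);  /* 18/144 = 1/8 */
    return 0;
}
```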

So, now that we've gone through the clusterfuck that is the Roman numeral system, we can pretty much understand why they didn't advance the field of mathematics very far. It was functional for its time, and the Romans still did many grand engineering feats that were no doubt supported by this number system. But when it came to further mathematics, such as the calculus finally developed by Newton and Leibniz independently of each other in the second half of the 17th century, you can't get very far when you don't have a positional system that makes adding and subtracting easier, you have no numeral for the number 0, and your dividing tactics do not handle decimals well, especially when your numerals are 1, 5, and then powers of 10. Decimals could be made possible with these numerals, yes, but it would be insanely difficult, and it's made so much easier when you assign every basic integer, zero through nine, its own numeral.

And that's probably why the Romans didn't do very well in mathematical advancement.



