
Store your monetary values as cents instead of dollars and use integers. Convert to dollars in the presentation layer. My assumption here is that you're dealing in US currency but the same idea applies to any currency; use integers, not decimals.
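A minimal sketch of this advice (the names and values are made up for illustration): amounts are stored and computed as integer cents, and only formatted as dollars at display time.

```python
# Hypothetical example: keep money as integer cents; only the
# presentation layer ever sees a dollar string.
price_cents = 1999      # $19.99 stored as an integer
quantity = 3
total_cents = price_cents * quantity   # exact integer arithmetic

def format_usd(cents: int) -> str:
    """Convert integer cents to a dollar string for display."""
    dollars, remainder = divmod(cents, 100)
    return f"${dollars}.{remainder:02d}"

print(format_usd(total_cents))  # $59.97
```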


Say you have a savings account with an interest rate of 0.125% and any interest has to be converted into another currency for taxation purposes at a rate of 0.88619.

Working around these sorts of problems with integers is just not worth the trouble.
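To illustrate the awkwardness (rates and balance are invented for this sketch): applying a 0.125% interest rate and a 0.88619 conversion rate to integer cents forces you to express each rate as an integer ratio and decide where truncation happens at every intermediate step.

```python
# Hypothetical sketch of pure-integer workarounds for fractional rates.
balance_cents = 123_456                           # $1,234.56
# 0.125% expressed as the ratio 125 / 100_000; result truncated to cents.
interest_cents = balance_cents * 125 // 100_000
# 0.88619 expressed as the ratio 88_619 / 100_000; truncated again.
converted_cents = interest_cents * 88_619 // 100_000
print(interest_cents, converted_cents)  # 154 136
```

Each `//` silently picks a rounding policy (truncation toward zero here), which is exactly the bookkeeping the comment above calls not worth the trouble.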


And yet the proper way to handle monetary values has always been to operate on integer values with a predetermined precision.

https://stackoverflow.com/questions/3730019/why-not-use-doub...

I can understand the convenience if you are setting up a simple online store, but more complicated cases (like the one in your example) absolutely need a proper handling of decimal values.


Can you explain to me what particular cases postgresql's decimal type cannot handle properly _and_ conveniently?


> Can you explain to me what particular cases postgresql's decimal type cannot handle properly _and_ conveniently?

I entirely agree that numeric is a good solution for this case.

But to try to answer your question nevertheless: I'd say there is potential for storage density and math performance issues. You'd need a number of numeric columns per row, and a lot of rows, before storage density matters. If you do a lot of aggregates over numerics you definitely can notice their CPU overhead compared to, say, a plain float. But most of the time that doesn't matter.


I agree completely.


Yes it is. You convert to a big decimal representation, compute the tax/interest and then intentionally round back to an integer format using the appropriate rounding mode.
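A minimal sketch of that flow using Python's decimal module (balance and rate are made up): widen to decimal, compute, then round back to integer cents with an explicit rounding mode.

```python
from decimal import Decimal, ROUND_HALF_EVEN

# Hypothetical values: $1,234.56 balance, 0.125% interest rate.
balance_cents = 123_456
rate = Decimal("0.00125")

interest = Decimal(balance_cents) * rate          # exact decimal arithmetic
# Intentionally round back to whole cents with a chosen rounding mode
# (banker's rounding here).
interest_cents = int(interest.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN))
print(interest_cents)  # 154
```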


GP didn't say it's impossible, he said it's not worth the trouble which is an opinion statement that can't be countered with "yes it is".


Yes, it can. The point is that you need to control where those moments of imprecision happen, and using a fixed-point (often integer/long) representation is absolutely worth the effort because it's not much effort at all and you DEFINITELY care about when your money calculations involve rounding.
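A quick illustration of why the rounding point matters: binary floats accumulate imprecision silently, while integer cents stay exact until you choose to round.

```python
# Summing ten dimes as floats drifts away from exactly 1.0 ...
float_total = sum(0.1 for _ in range(10))
print(float_total == 1.0)   # False

# ... while summing the same dimes as integer cents is exact.
cents_total = sum(10 for _ in range(10))
print(cents_total == 100)   # True
```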


And why would I not prefer to get help from a database system to do that?


If that's the only reason to use a "real" database, then what you have is a 100MB+ multi-process (and potentially costly) library to perform a few calculations.

(I don't disagree with you, just answering your question)


Of course there are many other considerations, but there is a class of small finance related multi-tenancy systems for which monetary calculations are key and either a database server or a database library can potentially make sense.

Resource consumption isn't an issue at all. 100MB+ is less than loading an average news website in a browser and it costs almost nothing. The reason why I would have wanted to use SQLite sometimes is that it makes it easy to distribute and run the entire app on premises if needed. People are rightly concerned about losing access to web based software.


There are also many use cases where you don't want to "store" something at all, just compute the value (e.g. a shopping cart that is not yet a purchase), and you don't want to either 1. duplicate implementations or 2. wait for a network request to your database to complete.



