
It almost never matters in scientific computing. Doubles give us the equivalent of almost 16 decimal digits of accuracy, which is more precision than any physical constant has been measured to. You're right that the world isn't decimal, and switching to decimal encodings actually reduces the effective precision you get out of the same number of bits.
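For a concrete sense of what "almost 16 digits" means, here's a minimal Python sketch; the comparison to a physical constant at the end is only illustrative:

    import sys

    # A double has a 53-bit significand, good for roughly 15-17
    # significant decimal digits.
    print(sys.float_info.dig)      # 15 -- decimal digits that always round-trip
    print(sys.float_info.epsilon)  # ~2.22e-16 -- relative spacing at 1.0

    # By comparison, the gravitational constant G is only known to
    # about 5-6 significant figures, so the double's precision is
    # nowhere near the bottleneck.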


There's a reason they're called the natural numbers. Nature doesn't have to be decimal for decimals to be useful (the question that started this debate); it just has to be rational. A great many parts of nature are rational, and sometimes we need to deal with them in scientific computing. DNA sequence processing comes to mind.
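If the quantities really are ratios of integer counts (as they often are in sequence data), exact rationals sidestep binary rounding entirely. A small Python sketch, assuming the values of interest are counts and count ratios rather than measured reals:

    from fractions import Fraction

    # Counts and ratios of counts stay exact as rationals; the same
    # value stored as a binary double picks up rounding error.
    freq = Fraction(1, 10)                        # exactly 1/10
    print(sum(freq for _ in range(10)) == 1)      # True: exact arithmetic
    print(sum(0.1 for _ in range(10)) == 1.0)     # False: 0.9999999999999999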



