
This reminds me of my professor's (probably very loose) description of NP-complete problems: the computer provides an answer that may or may not be correct, and all you have to do is check whether it is, since that check runs in polynomial time.
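To make the asymmetry concrete, here's a toy sketch in Python using subset-sum as the NP-complete problem (the problem choice and function name are mine, not the professor's): finding a solution may take exponential search, but verifying a proposed certificate is a single linear pass.

```python
def verify_subset_sum(numbers, target, certificate):
    """Polynomial-time check that `certificate` (a list of indices)
    selects distinct in-range entries of `numbers` summing to `target`."""
    if len(set(certificate)) != len(certificate):
        return False  # indices must be distinct
    if any(i < 0 or i >= len(numbers) for i in certificate):
        return False  # indices must be in range
    return sum(numbers[i] for i in certificate) == target

# Searching for a certificate naively means trying 2^n subsets;
# checking one takes linear time.
print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [2, 4]))  # 4 + 5 == 9 -> True
print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [0, 1]))  # 3 + 34 != 9 -> False
```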

It kind of grosses me out that we are entering a world where programming could be reduced to just testing (what look to me like) random permutations of programs for correctness.



Well we had to keep increasing inefficiency somehow, right? Otherwise how would Wirth's law continue to hold?


Most of the HW engineers I work with consider the web stack far more efficient than the HW-synthesis stack; i.e., there's more room for improvement in HW implementation than in SW optimization.



