Hacker News

Imagine that every computation is addressed by a hash; then everything becomes memoizable, with no distinction between data and code. As a consequence you get durability, caching, security to an extent, and verifiability through peers (trusted directly, or a few degrees away from peers you trust).
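A minimal sketch of the idea, assuming a simple in-process dictionary standing in for the durable, distributed store: key each result by a hash of the function's compiled code together with its arguments, so code and data are addressed uniformly. All names here (`STORE`, `content_address`, `memoized_call`) are illustrative, not a real API.

```python
import hashlib
import json

STORE = {}  # stands in for a durable, distributed key-value store

def content_address(fn, args):
    """Hash the function's bytecode together with its arguments,
    treating code and data as one addressable payload."""
    payload = json.dumps([fn.__code__.co_code.hex(), args], sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def memoized_call(fn, *args):
    key = content_address(fn, list(args))
    if key not in STORE:
        STORE[key] = fn(*args)  # compute once, keep forever
    return STORE[key]

def square(x):
    return x * x

memoized_call(square, 7)  # computed and stored under its hash
memoized_call(square, 7)  # served from the store, no recomputation
```

Peers holding the same store could then verify a claimed result by recomputing it and checking that it lands under the same hash.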


Is every computation worth memoizing? I can think of very few computations I do that others would care about, and in those cases there's already a much more efficient caching layer fronting the data anyway.


Why not? I think there is some interesting research here at the computational/distributed level that could lead to novel architectures and discoveries.

Fully distributed OSes / virtual machines / LLMs / neural networks

If LLMs are token predictors for language, what happens when you do token prediction for computation across a distributed network? Then run a NN on the cache and the clustering itself? Lots of potential use cases.



