
So the weird thing is that we're not only a lab, but also a web tool. We have the files backed up in a standard format in one place, but delivering 1024x1024x128 cubes of images over the internet has been tricky. We don't need people to always view them at full fidelity, just good enough.
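For scale, assuming 8-bit grayscale voxels (an assumption on my part, but typical for this kind of imagery):

    1024 * 1024 * 128 = 134,217,728 voxels
    at 1 byte/voxel   = 128 MiB raw per cube

so even modest compression matters a lot for interactive loading.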

We tried JPEG2000, which gave better quality per file size, but the web worker decoder was slower than the JPEG one, adding seconds to the total download/decode time.
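If anyone wants to reproduce that comparison, a rough timing harness looks like this (decodeJpeg/decodeJp2 are stand-ins for whatever decoder module the worker wraps, not real APIs):

    // Stand-ins for the actual decoder modules -- hypothetical names.
    declare function decodeJpeg(buf: ArrayBuffer): Uint8Array;
    declare function decodeJp2(buf: ArrayBuffer): Uint8Array;

    // Time a single decode and log it.
    function timeDecode(label: string,
                        decode: (b: ArrayBuffer) => Uint8Array,
                        buf: ArrayBuffer): void {
      const t0 = performance.now();
      decode(buf);
      console.log(`${label}: ${(performance.now() - t0).toFixed(1)} ms`);
    }

    // Encode the same source slice both ways, then:
    //   timeDecode("jpeg", decodeJpeg, jpegBytes);
    //   timeDecode("jp2",  decodeJp2,  jp2Bytes);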

EDIT: We're currently doing 256x256x256 cubes (16.8M voxels, the same pixel count as a 4096x4096 image) on eyewire.org. We're speeding things up to handle bigger 3D images.

EDIT2: If you check out Eyewire right now, you might notice some slowdown when you load cubes; that's because we're decoding on the main thread. We'll be changing that next week.
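For the curious, moving decode off the main thread can look something like this (a sketch, not our actual code; the worker filename and URL are made up):

    // decode-worker.ts -- hypothetical worker that decodes JPEG blobs
    // off the main thread and ships raw pixels back, transferring the
    // buffer instead of copying it.
    self.onmessage = async (e: MessageEvent<ArrayBuffer>) => {
      const blob = new Blob([e.data], { type: "image/jpeg" });
      const bitmap = await createImageBitmap(blob); // decodes in the worker
      const canvas = new OffscreenCanvas(bitmap.width, bitmap.height);
      const ctx = canvas.getContext("2d")!;
      ctx.drawImage(bitmap, 0, 0);
      const pixels = ctx.getImageData(0, 0, bitmap.width, bitmap.height).data;
      bitmap.close();
      (self as unknown as Worker).postMessage(pixels.buffer, [pixels.buffer]);
    };

    // main thread: hand the compressed bytes to the worker
    const worker = new Worker("decode-worker.js");
    worker.onmessage = (e: MessageEvent<ArrayBuffer>) => {
      const rgba = new Uint8ClampedArray(e.data);
      // ...upload to WebGL / composite into the volume here
    };
    fetch("/cubes/0/slice-0.jpg") // hypothetical URL
      .then(r => r.arrayBuffer())
      .then(buf => worker.postMessage(buf, [buf]));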



Yeah, jpeg2k sucks. It doesn't seem to do anything particularly well; design by committee ruined it by making it way too complex.



