If they can get it scaling decently, that's a good achievement in its own right. Say JS with a modern JIT is 4 times slower than C/C++. Then the interconnect latency of their system looks relatively 4 times better, because each node spends 4 times longer computing between messages.
If they now find an algorithm that scales to 2000 nodes with an efficiency even as low as 20%, that's a breakthrough achievement. It would port to a C/C++ based program, which would need "only" a 4 times faster interconnect to see the same scaling.
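To put rough numbers on that claim, here is a back-of-envelope sketch. The 4x JS slowdown and the 20% efficiency on 2000 nodes are the hypothetical figures from above, not measurements:

```python
# Hypothetical figures from the discussion, not measurements.
NODES = 2000        # cluster size
EFFICIENCY = 0.20   # parallel efficiency (20%)
JS_SLOWDOWN = 4.0   # assumed JS-vs-C/C++ single-node slowdown

# Effective speedup of the cluster relative to a single JS node.
speedup = NODES * EFFICIENCY
print(speedup)  # 400.0

# Even after paying the 4x JS penalty, the cluster is still well
# ahead of a single C/C++ node.
speedup_vs_one_c_node = speedup / JS_SLOWDOWN
print(speedup_vs_one_c_node)  # 100.0
```

So even under these pessimistic assumptions, the distributed JS version would beat a single C/C++ node by two orders of magnitude, which is why the scaling result would matter far more than the language choice.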
That said, an engine with a primitive search scales better: its tree is more regular, so the real size of the workloads is easier to predict. It could turn out that the algorithm doesn't actually work for a strong engine.
But anyway, the "mere" 4x factor due to JavaScript is peanuts compared to the rest of the problems.