
I haven't tried Avy, but I gave ace-jump-mode a shot. Maybe I didn't give it enough time, but it just never felt fast for me. The problem is the eye -> hand -> eye feedback loop latency. If I want to jump to a given place, I need to invoke ace-jump, wait to read what letter ace-jump assigns my destination, press that letter, read the new letter, and repeat until the destination becomes unambiguous. It just feels slower than navigating by sexp or whatever.

Maybe if I combined it with eye tracking, so letter assignment could be limited to the foveated region, I could reduce the cycle count and make this style of navigation win.





Just as an idea: you could try a more complicated setup where physical key position matches the movement direction, so Q would always mean upper-left. Then at least the first jump could be done without reading, though the final jump would still require reacting to feedback. (Or, for jumping to an arbitrary area of the screen, you could use a fixed "grid" that would also allow doing the first jump blindly.)
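To make the idea concrete, here's a rough sketch (in Python, just for illustration; a real implementation would be elisp inside avy/ace-jump): each key's physical position on the keyboard is mirrored onto the screen, so pressing a key blindly lands you in the corresponding region. The row layout and function name here are my own invention, not anything from either package.

```python
# Illustrative sketch only: map a key's physical position on a QWERTY
# grid to a screen region, so the first jump needs no visual feedback.

# Assumed 3x10 QWERTY rows (hypothetical; a real setup would come from
# the user's actual keyboard layout).
ROWS = ["qwertyuiop", "asdfghjkl;", "zxcvbnm,./"]

def key_to_region(key, screen_width, screen_height):
    """Return the (x, y) centre of the screen cell mirroring the key's
    physical position: 'q' -> upper-left, '/' -> lower-right."""
    for row_idx, row in enumerate(ROWS):
        col_idx = row.find(key)
        if col_idx != -1:
            cell_w = screen_width / len(row)
            cell_h = screen_height / len(ROWS)
            return ((col_idx + 0.5) * cell_w, (row_idx + 0.5) * cell_h)
    raise ValueError(f"key {key!r} not in grid")
```

After this blind first jump, a second round of hints (or a second grid pass over the now-smaller region) would refine the target, which is where the feedback loop from the parent comment still applies.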


