I've recently got a Rift with Touch controllers and set up room-scale tracking. I've found it is nearly impossible to navigate with the keyboard and mouse while walking around the room; there is simply no place to put them.
Could you please add basic movement with the VR controllers? Very basic functionality like moving forward and strafing should be easy to implement. At least the Unreal Engine tutorials I've watched so far do this from scratch in under 10 minutes...
I'd be glad to help with beta testing or otherwise contribute to getting this feature implemented. I also plan to try VR development with Unreal Engine or Unity and could try to implement this myself, but I don't think the sources are available...
Comments
Please, please add this, at least to YL2. I understand that complex projects differ a lot from simple examples, but in the Unity examples adding controller movement is just a matter of a few mouse clicks. There may not be many users running standing mode right now, but implementing this doesn't seem to require a lot of time, while it adds a really awesome feature...
I've partially solved the problem using this project: https://github.com/rajetic/auto_oculus_touch. If anyone wants, I can share my AutoHotkey script for YL. I'm not posting it yet, as I'm still trying to improve it.
At the moment I can't find a way to simulate mouse pointer movement with AutoHotkey. Mouse clicks work fine, but something is incompatible with mouse-move window messages. Maybe I should try to emulate a joystick instead?
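For anyone hitting the same wall: a common guess is that the game reads raw/relative mouse input and ignores absolute cursor repositioning, which is what AutoHotkey's default mouse-move path does. A minimal sketch (untested against YL, and `MoveMouseRelative` is just a name I made up) that sends a relative move through the Win32 `mouse_event` API instead, so it looks like real hardware motion:

```autohotkey
; Assumption: the game ignores absolute cursor moves (window messages /
; SetCursorPos) but accepts relative low-level input, like most FPS-style
; mouselook implementations.
SendMode Input  ; prefer the SendInput path for clicks/keys as well

; Hypothetical helper: send a relative mouse move of (dx, dy) pixels.
MoveMouseRelative(dx, dy) {
    DllCall("mouse_event"
        , "UInt", 0x0001      ; MOUSEEVENTF_MOVE = relative movement
        , "Int", dx
        , "Int", dy
        , "UInt", 0           ; dwData (unused for plain moves)
        , "UPtr", 0)          ; dwExtraInfo
}

; Quick test binding: F1 nudges the view 10 px to the right. In a real
; script you would call this from a timer that polls the Touch thumbstick
; via auto_oculus_touch and scales dx/dy by the axis value.
F1::MoveMouseRelative(10, 0)
```

If relative `mouse_event` moves still don't register, emulating a joystick (e.g. with a virtual joystick driver) and letting the game bind look/strafe to its axes is the usual fallback.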
Are there keyboard shortcuts to toggle POV, snapshot, and transparency?