Squeezing the trigger in VR
This topic has 9 replies, 3 voices, and was last updated 3 years, 9 months ago by ocularvr.
2021-03-09 at 10:43 am #39248 - ocularvr (Customer)
Good morning! I have one question which has been hindering my progress tremendously: how can I pick up an object, or even just register a trigger squeeze? Please help, as I am really trying everything and am out of ideas and options. None of the attached setups work; they just cause the browser to hang indefinitely.
Attachments:

2021-03-09 at 12:03 pm #39252 - Alexander Kovelenov (Staff)

2021-03-09 at 12:43 pm #39254 - ocularvr (Customer)
Hi Alexander, I changed the puzzle to rotation (45 degrees) with the same result. I would like to know how I can get the squeeze-trigger event to register; even just a print to the console would do.
2021-03-09 at 8:04 pm #39287 - Alexander Kovelenov (Staff)

2021-03-10 at 12:03 pm #39305 - ocularvr (Customer)
Hi again,
I tried all the possibilities and am just unable to get the squeeze-trigger event. Please see the attached steps. Not sure what else can be done. My end goal is to pick up an object by squeezing the trigger and move it around until the trigger is released, so I am a long way away. I am on 3.6.0 and Windows 10 with Chrome/Firefox.
Thanks for your patience with me.
Attachments:

2021-03-11 at 4:36 am #39316 - Crunch (Customer)
How are you checking the console debug window in VR mode to see if your squeeze event fired?
Answer: you can't, because the console isn't visible while you are in VR.
I haven't had the chance to figure out proper debugging in the Oculus browser, but I believe it involves using a USB cable (at least at first), which is probably why I have been avoiding it (**if anyone has a guide on this subject, it would be appreciated if you shared it!**).
If you need a simple debug, like printing "squeeze event happened!", what I did (since I am too lazy to get out a USB cable) was add a text object to my scene, then use the update-text-object puzzle to change the text object to whatever my console print message would have been.
If you set it up on a fire-on-every-frame puzzle, be warned that your frame rate will get crushed, as converting text to new objects is a heavy load on the system.
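For what it's worth, here is roughly what the squeeze events look like at the plain-JavaScript level. This is only a sketch, not my actual setup: the `squeezestart`/`squeezeend` events and `inputSource.handedness` are standard WebXR, while `renderer.xr` behaving like three.js's WebXRManager is an assumption about Verge3D, so verify it against your version.

```js
// Minimal sketch, not Verge3D-specific: hook the standard WebXR squeeze events
// and stash the last message in a string you can surface however you like
// (e.g. via the update-text-object puzzle). renderer.xr is ASSUMED to behave
// like three.js's WebXRManager -- check against your Verge3D version.
let lastDebugMessage = '';

function hookSqueezeDebug(renderer) {
  renderer.xr.addEventListener('sessionstart', function() {
    const session = renderer.xr.getSession();

    session.addEventListener('squeezestart', function(event) {
      // handedness is 'left', 'right' or 'none' (standard WebXR)
      lastDebugMessage = 'squeeze started: ' + event.inputSource.handedness;
    });

    session.addEventListener('squeezeend', function(event) {
      lastDebugMessage = 'squeeze released: ' + event.inputSource.handedness;
    });
  });
}
```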
I haven't looked too closely at your example puzzles, as my guess is you upgraded and then forgot to switch the default directory in the SSL xampp server setup you're testing from? I know I have done it before, where change after change and refresh after refresh seemingly has no effect, until you realize you are loading the right file name but from a different location.
If that's not your issue, continue to troubleshoot. The update-text method I mentioned above should help.
Unless you cannot even get into VR mode to check (meaning it's hanging on launch):
1. Listen to Yuri's #3: "Try running some VR-capable app which is known to work, say Industrial Robot."
If the robot demo works AND you can confirm the HTML file you are refreshing on your Quest 2 is being loaded from the same root directory on your test server as the working robot demo, then clear/delete the browser cache on your Oculus. Be sure you are hard-refreshing the HTML file after each change you make in your app.
2. If that doesn't work, disable, disable, disable. Back out each puzzle (disable it) in the order you added them, refreshing each time, until VR mode can load without hanging.
3. If you still cannot get into VR mode, disable ALL your puzzles and then, using the robot demo (or the user manual) as a reference, emulate every puzzle listed in those working examples.
4. If that doesn't work, and you have gone so far as to emulate everything down to the exact same characters between the 'working' demo from the developer and your version, I suppose we are all FOOKED!
J/K, but seriously, you are probably just doing something wrong.
Troubleshoot and get back to me, not Yuri or the other Soft8Soft crew. We need those guys in the lab, 100% focused on making us sweet new gadgets 😊
2021-03-11 at 9:36 am #39350 - ocularvr (Customer)
Dear Crunch: life saver!
OK, I will follow your advice to the millimeter, especially the bit about not bugging the team, because the last thing I want is to be a pain in the ***.
My issue: I want to recreate some of our projects in Verge3D. We use Unreal and the Quest. So from my point of view it's all possible; I just need to find a way, and that's where I really appreciate your help.
My objective is to understand what inputs I can use from the Oculus Quest: pick up an object, drop an object, throw an object, etc.
My Oculus is connected via a USB-C cable and I'm running Link.
I can read both controllers' axes 1 and 2; see the attached diagram.
What I cannot do is read any of the buttons. I can't read anything in the squeeze event either, as you saw in the previous screenshots.
The Robot demo has nothing in terms of navigation or buttons; it's all hover actions and is done differently. I would like to be able to pick objects up and move them around. There is a demo, but I cannot get it to work, as the squeeze-start function just will not fire for me. I can't get this to work at all: https://www.soft8soft.com/verge3d-3-1-for-blender-released/
Thanks for any help.
2021-03-11 at 10:19 am #39364 - ocularvr (Customer)
Just wanted to update the diagram.
Attachments:

2021-03-11 at 5:50 pm #39392 - Crunch (Customer)
Great Scott! Love the updated diagram.
I am out of time with a current work project, so I won't be able to post any puzzle examples today, but here is a screen-capture video of a demo app I did that has what you are looking to do.
To get there, you are going to want to start using the physics engine to control player movement. (Physics becomes mandatory if you want to be able to grab and throw objects around in your scene.)
So instead of using direct controller inputs to move around, you let the physics engine translate thumbstick motions into vector impulses that get applied to your camera rig (i.e. a cube with an empty parented on top, which the camera mounts on).
This makes motion much smoother in your headset versus being tied directly to controller inputs. Plus, by making your character body a dynamic physical object and the environment (walls, ground, etc.) static, collisions are effortless (meaning you can't walk through walls like a ghost).
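Just to illustrate the idea in code (a rough sketch, not my actual puzzles): the thumbstick values come from the standard WebXR gamepad on each input source, and on the xr-standard mapping the thumbstick sits at axes[2]/axes[3]. The applyImpulseToRig callback below is a hypothetical stand-in for whatever your physics puzzle or API provides, so don't copy it literally.

```js
// Rough sketch of thumbstick -> impulse movement. The gamepad/axes part is
// standard WebXR (xr-standard mapping: axes[2]/axes[3] = thumbstick).
// applyImpulseToRig() is a hypothetical placeholder for your physics layer.
const MOVE_SPEED = 0.5;
const DEAD_ZONE = 0.1;

function driveRigFromThumbstick(session, applyImpulseToRig) {
  for (const source of session.inputSources) {
    if (!source.gamepad || source.handedness !== 'left') continue;

    const x = source.gamepad.axes[2];   // thumbstick left/right
    const y = source.gamepad.axes[3];   // thumbstick forward/back
    if (Math.abs(x) < DEAD_ZONE && Math.abs(y) < DEAD_ZONE) continue;

    // Push the camera-rig body instead of moving the camera directly,
    // so the physics engine handles collisions with walls and ground.
    applyImpulseToRig({ x: x * MOVE_SPEED, y: 0, z: y * MOVE_SPEED });
  }
}
```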
I followed this guide as a starting point.
https://www.soft8soft.com/docs/manual/en/introduction/Physics-Guide.html

Screenshots of my puzzles will likely just confuse you, as they are a hot mess full of experiments, and I frankly already forgot most of what I did to get it working.
Yuri did post a Cube VR demo project, which was a very helpful reference.
Review this thread.
I've been meaning to collaborate a bit with Yuri and ask him to create a template puzzle (that could ship with future Verge releases) giving you drag-and-drop access to a complete set of physics-based Oculus controls (assuming your scene has character body and head objects set up), including a setup for the default Oculus "hands" or slots to add custom controllers (like my golden lobster pincers in my demo video, which articulate on squeeze events using a simple shape key). The logic would be: if there is an object within distance X, change the picked object from dynamic to kinematic, then snap and parent the object to the controller hand. On squeeze release, the object gets converted back to a dynamic object and unparented, so you can throw it. Easy cheezy. (A rough sketch of that grab/release logic is below.)
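Here is a hedged sketch of that grab/release flow in plain JavaScript, assuming Verge3D's v3d namespace mirrors the three.js scene graph (treat that as an assumption). setBodyKinematic() and the grabbable list are hypothetical placeholders for whatever your physics layer actually exposes.

```js
// Sketch of "squeeze to grab, release to throw". Object3D.attach() re-parents
// an object while keeping its world transform (three.js-style API, assumed to
// exist as v3d.*). setBodyKinematic() and grabbable are hypothetical.
const GRAB_DISTANCE = 0.15; // metres

function onSqueezeStart(grip, grabbable, setBodyKinematic) {
  const gripPos = new v3d.Vector3();
  grip.getWorldPosition(gripPos);

  for (const obj of grabbable) {
    const objPos = new v3d.Vector3();
    obj.getWorldPosition(objPos);

    if (gripPos.distanceTo(objPos) < GRAB_DISTANCE) {
      setBodyKinematic(obj, true);  // stop physics from driving it
      grip.attach(obj);             // snap/parent to the controller hand
      grip.userData.held = obj;
      break;
    }
  }
}

function onSqueezeEnd(grip, scene, setBodyKinematic) {
  const obj = grip.userData.held;
  if (!obj) return;

  scene.attach(obj);                // unparent, keep the world transform
  setBodyKinematic(obj, false);     // back to dynamic so it can be thrown
  grip.userData.held = null;
}
```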
I actually will attach a snapshot of some puzzles. Please note I don't think the 'test rotate' procedure is correct: I believe the first IF statement was my attempt to use physics to drive rotation, which I never got working, but the ELSE IF, where I directly rotate the camera rig from controller input, did work.
Again, as I mentioned before, that rotation (driven directly from controller input) is not ideal, as it makes motion kind of janky when you rotate. Or maybe I am just doing it wrong? (The direct-rotation fallback is sketched below.)
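For completeness, a sketch of that direct "rotate the rig from the thumbstick" fallback; rig is assumed to be the camera-rig object in the scene, and the axes index comes from the standard xr-standard gamepad mapping.

```js
// Smooth-turn sketch: yaw the camera rig directly from the right thumbstick.
// rig is assumed to be an Object3D-style node; nothing here is physics-driven.
const TURN_SPEED = 1.5;   // radians per second
const DEAD_ZONE = 0.2;

function rotateRigFromThumbstick(session, rig, deltaSeconds) {
  for (const source of session.inputSources) {
    if (!source.gamepad || source.handedness !== 'right') continue;

    const x = source.gamepad.axes[2];                  // thumbstick left/right
    if (Math.abs(x) < DEAD_ZONE) continue;

    rig.rotation.y -= x * TURN_SPEED * deltaSeconds;   // yaw the whole rig
  }
}
```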
Yuri, if you are reading: since the main character body is locked with an angular factor vector of 0,0,0, my idea for adding physics-driven rotation was to attach a secondary body on top of the main character body, parented/constrained somehow to the main body so that it could only rotate.
Think of the turret on top of a tank. Something like that. What do you think?
Ocularvr, sorry I can't be of more help today. It's going to take me a day or so to refresh myself on what exactly I did to get all that working.
Some tips I do remember:
Make use of the Traverse controllers puzzle, which I think iterates across both controllers. If your code assumes, say, that your right-hand controller is always index 1, that might not be the case. I THINK (not sure) Oculus might assign indexes in the order the controllers are connected.
So say you turn on your headset and activate your left-hand controller first: it might get assigned index 1 (or 0? whatever is first). So be sure when testing to try the clicks on the other controller if nothing seems to be happening. (A small handedness check is sketched below.)
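In plain WebXR terms, the safe pattern is to iterate the session's input sources and check handedness instead of hard-coding an index. The snippet below is only a sketch: the button indexes come from the standard xr-standard gamepad mapping (buttons[0] = trigger, buttons[1] = squeeze/grip), and xrSession is assumed to be whatever session handle your app exposes.

```js
// Sketch only: find a controller by handedness instead of assuming its index.
function findController(session, hand) {      // hand: 'left' or 'right'
  for (const source of session.inputSources) {
    if (source.handedness === hand) return source;
  }
  return null;
}

// xrSession is assumed to be your app's WebXR session handle.
// xr-standard gamepad mapping: buttons[0] = trigger, buttons[1] = squeeze/grip.
const right = findController(xrSession, 'right');
if (right && right.gamepad) {
  const triggerPressed = right.gamepad.buttons[0].pressed;
  const squeezePressed = right.gamepad.buttons[1].pressed;
}
```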
I will try to get back to doing some more VR stuff this weekend. Good luck!
Attachments:

2021-03-11 at 6:18 pm #39394 - ocularvr (Customer)
Roads? Where we're going, we don't need roads :)
Thank you so much for that awesome response. I will spend all of next week on it. What I do need and just can't get working is mapping the buttons as in the diagram I showed; my biggest problem there is not being able to detect the SQUEEZE START event. I'll keep trying.
One more observation, regarding using physics to pass vectors to the camera: I do this in Unreal, where it's done with a Spring Arm that lets the camera swing behind the first-person character. It's nice, but for use over five minutes it tends to cause motion sickness. We found that not having this, i.e. just having a fixed camera, is much better for motion sickness. I only use it at specific moments where I want to highlight whatever is going on.
So to summarize, I'm stuck on detecting the trigger squeeze :)
Regards and thank you!