It's a funny thing about putting someone in VR with hand-tracked controllers: one of the first things they want to do after discovering they can pick objects up is to throw them. Even for a 'serious' surgical training app like Osso VR, if you disappoint your users' expectations and don't let them start throwing things around like a toddler, you risk both breaking presence and dealing with real disappointment and frustration.
I learned this first-hand the first time I demoed working hand tracking in our app to anyone: after the initial wonder and amazement of being able to pick things up, the reaction to trying to throw something and watching it drop straight down to the floor was a look of disappointment and an exclamation that "this is s**t". I'd already experienced the disappointment myself and had throwing support on my task list, but it quickly moved up to top priority.
We're using Unity for our current prototype, and I use a PhysX fixed joint to attach grabbed objects to the user's hand (this lets me use the fixed joint's break force to prevent held objects from passing through solid objects in the environment). Releasing an object destroys the fixed joint, but simply relying on the physics engine to do its thing and transfer momentum to the released object when you try to throw it doesn't really work. Earlier in prototyping I simply parented the grabbed object to the hand, which results in the same 'dropping straight to the floor' behavior on release.
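For reference, a minimal sketch of this kind of joint-based grab is below, assuming a kinematic Rigidbody on the hand. The component, the method names and the break force value are illustrative rather than taken from our code.

    using UnityEngine;

    // Hypothetical sketch of a joint-based grab. Names and the break force value
    // are illustrative, not from our project.
    [RequireComponent(typeof(Rigidbody))]
    public class HandGrabber : MonoBehaviour
    {
        [SerializeField] float jointBreakForce = 2000f; // tune so held objects break free rather than push through geometry

        Rigidbody handBody;
        FixedJoint grabJoint;

        void Awake()
        {
            handBody = GetComponent<Rigidbody>();
        }

        public void Grab(Rigidbody target)
        {
            // Attach the grabbed object to the hand with a breakable fixed joint.
            grabJoint = target.gameObject.AddComponent<FixedJoint>();
            grabJoint.connectedBody = handBody;
            grabJoint.breakForce = jointBreakForce;
        }

        public void Release()
        {
            if (grabJoint != null)
            {
                Destroy(grabJoint);
                grabJoint = null;
            }
        }
    }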
You could try taking the instantaneous velocity and angular velocity at the moment of release and setting those on the released object, but what seems to work better is to take an average of the velocity and angular velocity of the hand over the last few frames and set that directly on the Rigidbody at the moment the object is released. I'm using a simple exponential moving average (EMA) calculated in FixedUpdate() and directly set the velocity and angularVelocity on the Rigidbody of the grabbed object when it is released / thrown.
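The release path might then look something like the snippet below. It extends the hypothetical HandGrabber sketch above, and velocityTracker refers to a hypothetical HandVelocityTracker component (sketched after the next paragraph) that exposes the smoothed values.

    // Extends the HandGrabber sketch above; velocityTracker is a hypothetical
    // HandVelocityTracker (sketched below) exposing the EMA-smoothed hand motion.
    [SerializeField] HandVelocityTracker velocityTracker;

    public void Throw(Rigidbody heldBody)
    {
        Release(); // destroys the fixed joint, as above

        // Hand the smoothed hand velocity and angular velocity to the thrown object.
        heldBody.velocity = velocityTracker.SmoothedVelocity;
        heldBody.angularVelocity = velocityTracker.SmoothedAngularVelocity;
    }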
Since our hand object is a kinematic rigid body, I manually calculate the instantaneous velocity and angular velocity each physics frame in FixedUpdate() from the transform (rather than getting them from the physics system), based on the delta from the previous frame. The weighting factor / alpha used for the EMA was just tuned until throwing felt reasonably good; there's not much science to it.
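Here is a sketch of how such a tracker could look. The class name, the default alpha and the small-angle guard are my own assumptions, not our exact implementation.

    using UnityEngine;

    // Hypothetical tracker for a kinematic hand: estimates velocity and angular
    // velocity from transform deltas each physics step and smooths them with an EMA.
    public class HandVelocityTracker : MonoBehaviour
    {
        [Range(0f, 1f)]
        [SerializeField] float alpha = 0.5f; // EMA weight, tuned until throwing feels right

        public Vector3 SmoothedVelocity { get; private set; }
        public Vector3 SmoothedAngularVelocity { get; private set; }

        Vector3 prevPosition;
        Quaternion prevRotation;

        void Start()
        {
            prevPosition = transform.position;
            prevRotation = transform.rotation;
        }

        void FixedUpdate()
        {
            float dt = Time.fixedDeltaTime;

            // Instantaneous linear velocity from the position delta.
            Vector3 velocity = (transform.position - prevPosition) / dt;

            // Instantaneous angular velocity from the rotation delta, in angle-axis form.
            Quaternion deltaRotation = transform.rotation * Quaternion.Inverse(prevRotation);
            float angleDegrees;
            Vector3 axis;
            deltaRotation.ToAngleAxis(out angleDegrees, out axis);
            if (angleDegrees > 180f)
            {
                angleDegrees -= 360f; // take the shorter rotation
            }
            Vector3 angularVelocity = Vector3.zero;
            if (Mathf.Abs(angleDegrees) > 0.001f)
            {
                angularVelocity = axis.normalized * (angleDegrees * Mathf.Deg2Rad / dt);
            }

            // Exponential moving average: new = alpha * current + (1 - alpha) * previous.
            SmoothedVelocity = alpha * velocity + (1f - alpha) * SmoothedVelocity;
            SmoothedAngularVelocity = alpha * angularVelocity + (1f - alpha) * SmoothedAngularVelocity;

            prevPosition = transform.position;
            prevRotation = transform.rotation;
        }
    }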
Our hands have a small collider in the palm which allows you to move things around with your open hand. I turn this off when an object is picked up, and after releasing / throwing I don't turn it back on again for a few hundred milliseconds, to prevent the thrown object immediately colliding with the hand.
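A hypothetical helper for that palm collider logic might look like this; the 0.3 second delay simply stands in for the "few hundred milliseconds" mentioned above.

    using System.Collections;
    using UnityEngine;

    // Hypothetical helper: the palm collider is disabled while holding an object and
    // only re-enabled a short delay after release, so a thrown object doesn't
    // immediately collide with the hand.
    public class PalmColliderToggle : MonoBehaviour
    {
        [SerializeField] Collider palmCollider;
        [SerializeField] float reenableDelay = 0.3f; // "a few hundred milliseconds"

        public void OnGrab()
        {
            StopAllCoroutines();
            palmCollider.enabled = false;
        }

        public void OnRelease()
        {
            StartCoroutine(ReenableAfterDelay());
        }

        IEnumerator ReenableAfterDelay()
        {
            yield return new WaitForSeconds(reenableDelay);
            palmCollider.enabled = true;
        }
    }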
Another thing that helps with throwing (and grabbing / releasing in general) is to use some hysteresis on the trigger values that signal grabbing / releasing objects. Rather than using the halfway point on the trigger to indicate grab / release, I initiate a grab when the trigger is being pressed (trigger values increasing since the last frame) and crosses around 20%, and initiate a release when the trigger is being released (trigger values decreasing since the last frame) and crosses around 80%.
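A sketch of that hysteresis logic is below. How you read the analog trigger value depends on your input API (e.g. OVRInput for the Touch controllers), so here it's just passed in each frame.

    // Sketch of the trigger hysteresis described above. Reading the analog trigger
    // value is left to whatever input API you use; it is passed in once per frame.
    public class TriggerHysteresis
    {
        const float GrabThreshold = 0.2f;    // grab once the trigger is squeezed past ~20%
        const float ReleaseThreshold = 0.8f; // release once it relaxes back below ~80%

        float previousValue;

        public bool IsGrabbing { get; private set; }

        // Call once per frame with the current trigger value in [0, 1].
        // Returns true when a grab or release was triggered this frame.
        public bool Update(float triggerValue)
        {
            bool changed = false;

            // Grab: trigger value increasing and crossing the 20% threshold.
            if (!IsGrabbing && previousValue < GrabThreshold && triggerValue >= GrabThreshold)
            {
                IsGrabbing = true;
                changed = true;
            }
            // Release: trigger value decreasing and crossing the 80% threshold.
            else if (IsGrabbing && previousValue > ReleaseThreshold && triggerValue <= ReleaseThreshold)
            {
                IsGrabbing = false;
                changed = true;
            }

            previousValue = triggerValue;
            return changed;
        }
    }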
Some people manage to throw an object up in the air and catch it on their first try, but most people take a few attempts to master it. We have gravity set to about 80% of normal to make it a little easier. One of the Job Simulator devs has pulled off juggling in VR, but although I can juggle ok in real life I haven't yet managed to pull it off in either their app or ours...
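For what it's worth, the reduced gravity is just a global physics setting; the 0.8 factor below matches the "about 80% of normal" mentioned above, and the one-off component is only an illustration (you can also set this in the physics project settings).

    using UnityEngine;

    // Hypothetical one-off setup: scale global gravity to roughly 80% of normal.
    public class GravityTweak : MonoBehaviour
    {
        void Awake()
        {
            Physics.gravity = new Vector3(0f, -9.81f * 0.8f, 0f);
        }
    }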
One final tip: given that people will inevitably end up deliberately or accidentally dropping or throwing stuff on the floor, it's also a good idea to have objects return to their starting positions some time after landing there. The Rift in particular does not do a great job of tracking at floor level, and it's not much fun rummaging around to pick stuff up off the floor in VR even when tracking is working well.
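A simple way to do this is a per-object behaviour that remembers its starting pose and teleports the object back after it has sat at floor level for a while; the height threshold and delay below are illustrative.

    using UnityEngine;

    // Hypothetical respawn behaviour: if an object ends up at floor level, send it
    // back to its starting pose after a short delay. Threshold and delay are illustrative.
    public class ReturnToStart : MonoBehaviour
    {
        [SerializeField] float floorHeight = 0.05f; // y position below which it counts as "on the floor"
        [SerializeField] float returnDelay = 5f;    // seconds to wait before returning

        Vector3 startPosition;
        Quaternion startRotation;
        float timeOnFloor;

        void Start()
        {
            startPosition = transform.position;
            startRotation = transform.rotation;
        }

        void Update()
        {
            if (transform.position.y <= floorHeight)
            {
                timeOnFloor += Time.deltaTime;
                if (timeOnFloor >= returnDelay)
                {
                    Respawn();
                }
            }
            else
            {
                timeOnFloor = 0f;
            }
        }

        void Respawn()
        {
            timeOnFloor = 0f;
            transform.position = startPosition;
            transform.rotation = startRotation;

            // Kill any residual motion so the object settles where it respawns.
            Rigidbody body = GetComponent<Rigidbody>();
            if (body != null)
            {
                body.velocity = Vector3.zero;
                body.angularVelocity = Vector3.zero;
            }
        }
    }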
There's a bit of work and tuning required to get throwing working reasonably well, but I think it's well worth it. It all helps with presence / immersion, and for our training application we believe that getting people to 'buy in' to the reality of the environment at a deep level will help with retention of the material. It's also just plain fun, and it's a big factor in making our demo an enjoyable experience even for people who have no professional interest in the medical subject matter.