VR Project

Link to Web

Jie Guan

3155359

The Goal

This project attempts to bring ML-Agents into virtual reality and create a playable interaction with artificial intelligence agents. I tried to develop several different projects with multiple agents in the same scene, but the ML-Agents Toolkit seems to have limitations, so they did not work well. To achieve my goal, I chose the soccer example from the toolkit as the basis for the VR project, and I developed it in Unity with an Oculus Rift and VRTK.


All my project files can be downloaded here.

My Process

First of all, I imported Oculus Integration and the Virtual Reality Toolkit (VRTK) from the Asset Store in Unity. I found this easier than downloading the SDKs from their websites: it needs no extra setup steps and does not produce errors in the console.

Then, I set up the VRTK SDK Manager and VRTK_SDK Setup in the Hierarchy and dragged the OVR Camera Rig prefab from the Oculus folder in as a child of VRTK_SDK Setup. Under the Left Controller Anchor and Right Controller Anchor, I dragged the basic hand models from VRTK, so the controllers appear as animated hands when we wear the VR headset. On the VRTK SDK Manager, we need to drag the SDK Setup onto its script in the Inspector to tell it we are using the Oculus OVR Camera and its controllers.
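The same hand attachment can also be done from a script. Below is a minimal, hypothetical sketch that parents the VRTK hand prefabs under the OVR Camera Rig's controller anchors at runtime; the anchor paths come from the OVRCameraRig prefab hierarchy, while the script name and prefab fields are assumptions for illustration.

```csharp
using UnityEngine;

// Hypothetical helper that parents the VRTK hand models under the Oculus controller
// anchors at runtime, mirroring the drag-and-drop setup described above. The anchor
// paths come from the OVRCameraRig prefab hierarchy; the prefab fields are assumptions.
public class HandModelSetup : MonoBehaviour
{
    public Transform ovrCameraRig;     // the OVR Camera Rig instance under VRTK_SDK Setup
    public GameObject leftHandPrefab;  // VRTK basic left-hand model
    public GameObject rightHandPrefab; // VRTK basic right-hand model

    void Start()
    {
        AttachHand(leftHandPrefab, "TrackingSpace/LeftHandAnchor/LeftControllerAnchor");
        AttachHand(rightHandPrefab, "TrackingSpace/RightHandAnchor/RightControllerAnchor");
    }

    void AttachHand(GameObject handPrefab, string anchorPath)
    {
        Transform anchor = ovrCameraRig.Find(anchorPath);
        if (anchor != null)
        {
            // Parent the hand to the anchor so it follows the tracked controller.
            Instantiate(handPrefab, anchor.position, anchor.rotation, anchor);
        }
    }
}
```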

Now that the basic setup is done, when I delete the blue goalie and move the Camera Rig to its position, I can see myself inside the game world while wearing the VR headset. Nevertheless, we need interaction to play this game, so it is time to set up the soccer ball and our controllers. I added VRTK_Straight Pointer Renderer and VRTK_Pointer as components on both the left and right controller anchors. VRTK_Straight Pointer Renderer controls what the pointer looks like; I kept it as the standard green. Then I dragged it onto the Pointer Renderer field of the Pointer script and set the Activation Button to Grip Press so the pointer activates when we press the grip button. Now we can see a green pointer when we press the grip button on the controller.
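For reference, here is a minimal sketch of the same pointer setup done in code instead of the Inspector, assuming VRTK 3.x; the component and field names (VRTK_StraightPointerRenderer, VRTK_Pointer, activationButton, ButtonAlias.GripPress) are from that version and may differ elsewhere, and the script name is hypothetical.

```csharp
using UnityEngine;
using VRTK;

// Hypothetical sketch: attach this to a controller object to recreate the
// pointer setup described above (VRTK 3.x component and field names assumed).
public class PointerSetup : MonoBehaviour
{
    void Awake()
    {
        // VRTK reads button presses through a VRTK_ControllerEvents component.
        gameObject.AddComponent<VRTK_ControllerEvents>();

        // The renderer controls how the beam looks; the default is a green line.
        var pointerRenderer = gameObject.AddComponent<VRTK_StraightPointerRenderer>();

        // The pointer itself: link the renderer and activate it with Grip Press.
        var pointer = gameObject.AddComponent<VRTK_Pointer>();
        pointer.pointerRenderer = pointerRenderer;
        pointer.activationButton = VRTK_ControllerEvents.ButtonAlias.GripPress;
    }
}
```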

Lastly, I added VRTK_Interact Grab to both controllers so they can interact with grabbable objects, and set the Grab Button to Touchpad Press. On the soccer ball, I added VRTK_Interactable Object to mark it as a grabbable object. Now all the setup is done; we can wear the headset and grab and throw the soccer ball to play the game with our AI goalie and strikers.
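A scripted version of this grab setup could look roughly like the sketch below, again assuming VRTK 3.x; VRTK_InteractTouch is added because grabbing relies on touch detection, the grab attach mechanic is the stock fixed-joint one, and the script name and public fields are assumptions for illustration.

```csharp
using UnityEngine;
using VRTK;
using VRTK.GrabAttachMechanics;

// Hypothetical sketch of the grab setup described above, assuming VRTK 3.x.
public class GrabSetup : MonoBehaviour
{
    public GameObject leftController;
    public GameObject rightController;
    public GameObject soccerBall;

    void Awake()
    {
        foreach (var controller in new[] { leftController, rightController })
        {
            // Grabbing relies on touch detection plus the grab action itself.
            controller.AddComponent<VRTK_InteractTouch>();
            var grab = controller.AddComponent<VRTK_InteractGrab>();
            grab.grabButton = VRTK_ControllerEvents.ButtonAlias.TouchpadPress;
        }

        // Mark the ball as grabbable; it also needs a collider and rigidbody
        // (which the soccer example already has) and a grab attach mechanic.
        var interactable = soccerBall.AddComponent<VRTK_InteractableObject>();
        interactable.isGrabbable = true;
        soccerBall.AddComponent<VRTK_FixedJointGrabAttach>();
    }
}
```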

Experiment

This project turns the 2D soccer game into a virtual reality experiment: players use their physical movement to control the goalie rather than the keyboard. When I play the 2D soccer game, I feel I can beat the intelligent agents easily just by pressing keys with my fingers. However, when I wear the headset and play it in VR, it is entirely different. As the goalie, I have to move my whole body to keep the ball from reaching my goal and use the controllers to grab and throw the ball to my striker. In VR, I find that the agents I trained are brilliant; they know how to reach their goal better than I do.




Reflection

In this project I integrated the soccer example from the ML-Agents Toolkit with VRTK and created a VR playing experiment with artificial intelligence strikers and a goalie. Through this project, I learned how to set up the Oculus Rift with VRTK and adjust my position in the virtual world. I came to understand how to replace the controller model with an animated hand model, and how to set up the buttons to perform functions such as enabling the pointer line, teleporting, and grabbing and throwing objects. Although this is a soccer game, I did not have a chance to make it playable with our feet; I only used the controllers to interact with the ball. In my game, pressing the trigger and grip together enables the pointer line; when this line collides with the ball and the thumbstick button is held, the ball is grabbed and can be thrown. In the future, I want to have full-body tracking so I can use my whole body to engage with the virtual world. I have a Kinect v2 and have already made it work in Unity to map my movements onto a virtual character. I think that if I adjust the VR camera position and make it a child of the virtual character's head, everything will run as I expect. Moreover, I want to extend the concept of Augmented Virtuality in a future project and keep thinking about how to see my body in the virtual environment and make it interact with virtual objects.
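As a sketch of that last idea, the snippet below parents a VR camera rig under the head bone of a humanoid avatar (for example one driven by Kinect v2). The script name and fields are hypothetical, and it assumes the avatar uses Unity's humanoid rig so the head bone can be looked up through the Animator.

```csharp
using UnityEngine;

// Hypothetical sketch of the idea above: parenting the VR camera rig to the head bone
// of a Kinect-driven humanoid avatar so the player sees from the character's eyes.
public class AttachCameraToHead : MonoBehaviour
{
    public Transform vrCameraRig;   // e.g. the OVR Camera Rig transform
    public Animator avatarAnimator; // the Kinect-driven humanoid avatar

    void Start()
    {
        Transform head = avatarAnimator.GetBoneTransform(HumanBodyBones.Head);
        if (head != null)
        {
            // Follow the head; zero the local offset so the camera sits at the head bone.
            vrCameraRig.SetParent(head, worldPositionStays: false);
            vrCameraRig.localPosition = Vector3.zero;
            vrCameraRig.localRotation = Quaternion.identity;
        }
    }
}
```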

