Videos

Final Project Presentation

Final Project

In-Progress Review Presentation

Design Review Presentation

This presentation covers the prototyping our group has completed and marks the beginning of the transition from the prototyping stage to the implementation stage. The video shows the finalized flowcharts for the software and hardware behavior of our project and covers the steps taken in prototyping our equipment. Demonstrations of both controller methods are discussed, and applications of Unity for module creation, along with extraction of data from the EPOC+, highlight the backbone of this project. The video concludes with a discussion of the steps we plan to take to complete the project on time.

Project Proposal Presentation

This video is our original presentation discussing the requirements for the project. We discuss the importance of this project and the requirements necessary for its completion. We outline an approach for completing the project, as well as possible alternative approaches. This presentation provides the structure for the prototyping discussed in “Design Review Presentation” above.

Importing Emotiv Data Into Unity

This video demonstration shows the functionality of using EMOTIV data within Unity. To obtain this data, our Unity project must subscribe to the EMOTIV Cortex API. There are two main functions that improve ease of use, both demonstrated above. First, a sensor map can be accessed, allowing the user to see the contact strength of each sensor. This data is also viewable in the EMOTIV app, but accessing it in Unity lets us consolidate the EMOTIV data and the learning modules in one location. The second function revolves around the use of the data itself, which can be obtained from three streams. The EEG stream provides the raw input from each sensor, allowing sensor-specific data to be accessed. The second stream provides gyroscope data from the headset, which offers a clear benefit: it can let the user look around the virtual environment using head tracking. Lastly, the Cortex API provides mental performance measurements for stress, engagement, interest, excitement, focus, and relaxation. These can be used to verify the raw EEG data and provide an understanding of the physiological state of the user.
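The video does not walk through the connection code itself, but the subscription happens over the Cortex API’s WebSocket interface using JSON-RPC. Below is a minimal sketch of the idea, assuming an authorization token and an open session have already been obtained; the endpoint URL and stream names follow the Cortex documentation, and this outline is not our exact implementation.

```csharp
// Minimal sketch of subscribing to Cortex API data streams.
// Assumes an authorized Cortex token and an open session already exist;
// the token and session ID passed in here are placeholders.
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

public class CortexSubscriber
{
    private readonly ClientWebSocket socket = new ClientWebSocket();

    public async Task SubscribeAsync(string cortexToken, string sessionId)
    {
        // The Cortex service listens locally on a secure WebSocket.
        await socket.ConnectAsync(new Uri("wss://localhost:6868"), CancellationToken.None);

        // JSON-RPC request asking Cortex for contact quality ("dev"),
        // raw EEG ("eeg"), motion ("mot"), and performance metrics ("met").
        string request =
            "{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"subscribe\"," +
            "\"params\":{\"cortexToken\":\"" + cortexToken + "\"," +
            "\"session\":\"" + sessionId + "\"," +
            "\"streams\":[\"dev\",\"eeg\",\"mot\",\"met\"]}}";

        await socket.SendAsync(
            new ArraySegment<byte>(Encoding.UTF8.GetBytes(request)),
            WebSocketMessageType.Text, true, CancellationToken.None);

        // Print the first response; real code would loop and parse each message.
        var buffer = new byte[8192];
        var result = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
        Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, result.Count));
    }
}
```

Each message Cortex sends back is a JSON object tagged with its stream name, so the sensor map and the module logic can each parse only the streams they care about.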

Jumping Scenes Using Collision Boxes

This video is a demonstration of scene navigation using a jump-scene C# script. Both module selection and the settings menu implementation are shown above.
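For reference, a minimal version of a jump-scene script might look like the sketch below; the `targetScene` field and the “Player” tag are illustrative choices rather than necessarily what our script uses, and the collision box’s collider must be marked as a trigger.

```csharp
// Minimal sketch of a scene-jump trigger. Assumes the entering object
// is tagged "Player" and the target scene is in the build settings.
using UnityEngine;
using UnityEngine.SceneManagement;

public class JumpScene : MonoBehaviour
{
    // Name of the scene to load, set in the Inspector.
    public string targetScene;

    private void OnTriggerEnter(Collider other)
    {
        // Only react to the player entering the collision box.
        if (other.CompareTag("Player"))
        {
            SceneManager.LoadScene(targetScene);
        }
    }
}
```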

Leap Motion Interaction with Custom Unity Objects

This video is a demonstration of our testing with creating objects in Unity that can interact with the Leap Motion controller. In this video, the desk serves as the ground plane because it has no interactive element. On the desk there are two objects, a black cube and a white cube. Using interaction behaviors, we are able to change the order in which these boxes are stacked. We can also throw the cubes by releasing our grasp while the Unity hands are in motion.
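The interaction behaviors come from the Leap Motion Interaction Engine. As an illustration, a cube can be made graspable with a sketch like the one below, assuming the Leap Motion Unity Modules are imported and an Interaction Manager is present in the scene; this is one way to wire it up, not necessarily how our test scene was built.

```csharp
// Sketch of making a cube graspable with the Leap Motion Interaction Engine.
// Assumes the Leap Motion Unity Modules are imported and the scene
// contains an Interaction Manager.
using Leap.Unity.Interaction;
using UnityEngine;

public class GraspableCube : MonoBehaviour
{
    private void Start()
    {
        // A Rigidbody lets the cube fall, stack, and be thrown.
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.mass = 1f;

        // InteractionBehaviour lets the tracked Unity hands grasp the cube;
        // on release it keeps the hand's velocity, so it can be thrown.
        gameObject.AddComponent<InteractionBehaviour>();
    }
}
```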

Leap Motion Controller Diagnostic Testing

This video is a demonstration of the hand mapping used by the Leap Motion controller. The Leap Motion’s ability to map hand movements from multiple angles allows for flexibility in its use. It also maps the hands at specific joint locations, allowing it to determine different hand orientations. The Leap Motion is our alternative option for controlling the learning modules.
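For context, reading the mapped hand data in code is straightforward with the Leap C# API. The sketch below simply logs each tracked palm position; it illustrates the API rather than reproducing the diagnostic tool shown in the video.

```csharp
// Sketch of reading tracked hand positions with the Leap C# API.
// Assumes the Leap Motion SDK's C# assembly is referenced.
using Leap;
using UnityEngine;

public class HandLogger : MonoBehaviour
{
    private Controller controller;

    private void Start()
    {
        // Connects to the Leap Motion service in the background.
        controller = new Controller();
    }

    private void Update()
    {
        // Frame() returns the most recent tracking frame from the service.
        Frame frame = controller.Frame();
        foreach (Hand hand in frame.Hands)
        {
            Debug.Log((hand.IsLeft ? "Left" : "Right") +
                      " palm at " + hand.PalmPosition);
        }
    }
}
```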

Unity Environment

The structure of our learning modules revolves around 3D environments created in Unity. Because no member of our group had any experience with scene creation, we created a test environment to learn the basics. This video demonstrates the creation of a classroom environment in Unity. To create this environment, we learned about creating shapes, lighting a scene, implementing prefabs, and positioning objects in the environment.
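The same basics can also be exercised from a script. The sketch below is illustrative rather than a piece of our actual scene: it creates a primitive shape, adds a directional light, and instantiates a prefab; the prefab reference is a placeholder to be assigned in the Inspector.

```csharp
// Illustrative sketch of the scene-building basics: a primitive shape,
// a directional light, and a prefab instance placed in code.
using UnityEngine;

public class ClassroomBuilder : MonoBehaviour
{
    public GameObject deskPrefab; // placeholder; assign any prefab in the Inspector

    private void Start()
    {
        // Create and position a primitive cube to serve as a table top.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0.5f, 2f);
        cube.transform.localScale = new Vector3(1.5f, 0.1f, 0.8f);

        // Add a directional light angled down into the room.
        GameObject lightObj = new GameObject("Sun");
        Light light = lightObj.AddComponent<Light>();
        light.type = LightType.Directional;
        lightObj.transform.rotation = Quaternion.Euler(50f, -30f, 0f);

        // Instantiate a prefab at a chosen position.
        if (deskPrefab != null)
        {
            Instantiate(deskPrefab, new Vector3(2f, 0f, 2f), Quaternion.identity);
        }
    }
}
```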

VR Headset Modeling

If using a consumer-market VR headset does not allow for proper sensor placement of the EPOC+, we plan to model our own headset in 3D. This video demonstrates the model we have created. It currently has four components: the headset casing, the back panel, the lens holders, and the LCD clips. If the Leap Motion is used in our project, we will need to design a mount for it on the headset. If this design is used, we must also find a way to attach the lenses and straps to allow for proper use of the headset.

3D Modeled Controller

Our primary approach for interacting with the VR environment is a custom-made controller. This video demonstrates the key characteristics of the controller. The main casing is large enough to fit a Raspberry Pi Zero and an ADXL345 accelerometer. Two buttons are mounted externally on the controller, and a power cord plugs into its side.
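The video does not cover how the controller communicates with Unity, so purely as an illustration, the sketch below assumes the Pi Zero streams comma-separated accelerometer and button readings to Unity over UDP; the packet format and port number are hypothetical.

```csharp
// Illustrative sketch only: assumes the Pi Zero sends "ax,ay,az,btn1,btn2"
// packets over UDP on port 5005. The format and port are hypothetical.
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class ControllerReceiver : MonoBehaviour
{
    private UdpClient udp;
    public Vector3 acceleration;
    public bool button1, button2;

    private void Start()
    {
        udp = new UdpClient(5005);
    }

    private void Update()
    {
        // Drain any packets that arrived since the last frame.
        while (udp.Available > 0)
        {
            IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
            string[] parts = Encoding.ASCII.GetString(udp.Receive(ref remote)).Split(',');
            if (parts.Length == 5)
            {
                acceleration = new Vector3(
                    float.Parse(parts[0]), float.Parse(parts[1]), float.Parse(parts[2]));
                button1 = parts[3] == "1";
                button2 = parts[4] == "1";
            }
        }
    }

    private void OnDestroy()
    {
        udp.Close();
    }
}
```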

Math Problem EPOC+ Signals

This video demonstrates a test of the EPOC+ while concentrating on a math problem. The two graphs in this video show the AF3 and AF4 brain channels.
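For context on how channel-specific values are isolated in software, the sketch below picks the AF3 and AF4 values out of a single EEG sample, assuming the channel labels are those reported when subscribing to the EEG stream; the helper itself is hypothetical and applies equally to the other channel tests below.

```csharp
// Hypothetical helper that extracts the AF3 and AF4 values from one EEG
// sample, given the channel label order reported at subscription time.
using System;

public static class ChannelPicker
{
    public static (double af3, double af4) PickFrontal(string[] labels, double[] sample)
    {
        int af3 = Array.IndexOf(labels, "AF3");
        int af4 = Array.IndexOf(labels, "AF4");
        if (af3 < 0 || af4 < 0)
        {
            throw new ArgumentException("AF3/AF4 not found in channel labels");
        }
        return (sample[af3], sample[af4]);
    }
}
```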

Playing Game EPOC+ Signals

This video demonstrates a test of the EPOC+ while playing a video game. This test was implemented to examine concentration. The two graphs in this video show the AF3 and AF4 brain channels.

Reading a Webpage with Distractions

This video demonstrates a test of the EPOC+ while reading a webpage. Group members attempted to provide distractions during the test. This test is intended to demonstrate the brainwave patterns stemming from changes in focus. The two graphs in this video show the AF3 and AF4 brain channels.