Friday, February 21, 2020

Medieval Village in VR

This week I managed to populate the village with a few characters that (at least for now) randomly pick a destination and an action to perform (idle, walk, or run) for a couple of seconds, then choose another. In the future I'm planning to start working on turning these characters into autonomous agents, meaning they will be endowed with specific knowledge and the ability to make their own decisions in response to environmental input such as changes in weather and time (day/night, rain, cold, etc.), physiological conditions (hunger, thirst, tiredness), and more.
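The current behavior loop could be sketched roughly as follows. This is only an illustrative Python sketch, not the actual game code; the `Villager` class, its fields, and the 2–4 second interval are my own assumptions based on the description above.

```python
import random

# The three actions mentioned in the post.
ACTIONS = ("idle", "walk", "run")

class Villager:
    """Hypothetical NPC: picks a random destination and action,
    holds them for a couple of seconds, then picks again."""

    def __init__(self, world_size=100.0):
        self.world_size = world_size
        self.choose_behavior()

    def choose_behavior(self):
        # Random point inside the village bounds plus a random action,
        # held for a short, randomized interval (assumed 2-4 s here).
        self.destination = (random.uniform(0, self.world_size),
                            random.uniform(0, self.world_size))
        self.action = random.choice(ACTIONS)
        self.time_left = random.uniform(2.0, 4.0)

    def update(self, dt):
        # Called once per frame with the elapsed time; when the timer
        # runs out, a fresh destination/action pair is chosen.
        self.time_left -= dt
        if self.time_left <= 0:
            self.choose_behavior()
```

The planned autonomous-agent version would replace `choose_behavior` with a decision function that weighs environmental and physiological state instead of sampling uniformly at random.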
Stay tuned for more on this project and don't forget to check the video below.

Thursday, February 20, 2020

UHV's Provost Dr. Chance Glenn visiting AMETIST

Today we received a visit from Dr. Chance Glenn, UHV's Provost. He came to our lab to check out the projects being developed by the AMETIST group, such as motion capture, virtual reality, and augmented reality. He took the opportunity to try a VR-based experiment on the effects of excessive drinking, being developed by the student Itzel Juarez under the supervision of Dr. Brent Lang (Psych.), Dr. Rogerio da Silva (CS/Gaming), and Dr. Scheila Martins (CS).

We recorded this moment. Check the video below:

Monday, February 17, 2020

Virtual Reality-based Drinking Simulator

Another ongoing project, developed by the student Itzel Juarez under the supervision of Dr. Brent Lang, Dr. Scheila Martins, and myself, is called VR-Drinking. The proposal is to develop a VR-based environment where one can virtually experience the physiological reactions to excessive drinking. Symptoms such as dizziness, drowsiness, loss of balance, and visual and auditory confusion are planned to be simulated. The application targets a young audience (e.g., K-12 students) and aims to teach about the effects and consequences of trying alcohol. Below you can check some snapshots of what's been developed so far (more to come).

Wednesday, February 12, 2020

MOCAP with 3 actors simultaneously

And our team is unstoppable! In fact, we have grown: we are now almost 10 people studying motion capture, virtual reality, and photogrammetry on a weekly basis. This week part of the team managed to add a third actor to a mocap session and record a new animation. You can check the results in the video below.

Thursday, February 6, 2020

Another Photogrammetry experiment

The student Robert Colburn is taking his first steps in learning how to convert a series of photographs into a textured 3D model using a technique called photogrammetry. Here is a snapshot of his first work. More to come...

Wednesday, February 5, 2020

Another milestone: 2 simultaneous actors on the mocap floor

And the students keep learning! Yesterday, we successfully managed to record two actors (Savvy and Zach) simultaneously on the mocap floor. For that to be possible, each actor wears a unique marker template on their back so that the computer can identify them and separate the capture for each skeleton (in real time). We also managed to incorporate two props (katanas). All of this combined allowed us to record an entire fight scene between two magical warriors (final animation coming soon!).
For now, here are some photos and videos of this highly productive day.

Wednesday, January 29, 2020

Mocap lab now operating at full capacity

Thanks to Daniel A. Henderson, who spent an afternoon studying the VICON manual and learning how to calibrate the cameras, we now have 18 cameras "locked and loaded". We are ready for our next mocap recording session, which should take place this afternoon after 4 PM with another student, Dennis Barbee. I'm really excited to see what will come out of these recordings. If you are interested in taking part in our AMETIST group, just show up or pay me a visit at my office, UW 254.