Below you'll find an ever-growing list of tasks that I've either completed or need to complete in order to finish my first Virtual Reality / Kinect game. While this is really my blogging version of thinking out loud, I do hope it ends up helping people who are in a similar position to mine as I start writing this.
TO DO:
1. Hook up the Kinect and settle on an SDK which binds it to Unity 3D - DONE. Here I explain why I chose what I chose.
2. Understand avatar tracking, and successfully create a trackable avatar of my own.
3. Create a split-screen view: on the top of the screen I want to see the avatar from a distance, and on the bottom of the screen I want to see a first-person view.
4. Make it so that when the first-person camera moves using the mouse, the avatar's head actually moves accordingly.
5. Ensure that the head movements are realistic.
6. Can the head be moved using the Kinect? Try and see.
7. Make another avatar which faces the first avatar, but about 10 feet away (maybe more).
8. Animate that avatar performing a throwing motion on a key press.
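For anyone tackling similar tasks, here's a minimal sketch of how the split-screen and mouse-driven head items could be wired up in Unity. This is my own illustration, not code from the game: the script name, camera fields, and sensitivity/clamp values are all hypothetical, and it assumes a rigged avatar whose head bone is assigned in the Inspector.

```csharp
using UnityEngine;

// Hypothetical sketch: split-screen cameras plus mouse-driven head rotation.
public class SplitScreenHeadLook : MonoBehaviour
{
    public Camera overviewCamera;    // third-person view of the avatar (top half)
    public Camera firstPersonCamera; // parented near the avatar's eyes (bottom half)
    public Transform headBone;       // the avatar's head bone
    public float sensitivity = 2f;
    public float maxPitch = 60f;     // clamp angles to keep head movement realistic
    public float maxYaw = 80f;

    private float pitch, yaw;

    void Start()
    {
        // Split the screen horizontally via viewport rects:
        // overview on top, first person below.
        overviewCamera.rect = new Rect(0f, 0.5f, 1f, 0.5f);
        firstPersonCamera.rect = new Rect(0f, 0f, 1f, 0.5f);
    }

    void LateUpdate()
    {
        // Mouse input drives the head bone, so the overview camera shows
        // the avatar's head turning as the first-person view moves.
        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;
        yaw = Mathf.Clamp(yaw, -maxYaw, maxYaw);
        pitch = Mathf.Clamp(pitch, -maxPitch, maxPitch);

        // LateUpdate runs after the Animator, so this rotation overrides
        // whatever the animation put on the head bone this frame.
        headBone.localRotation = Quaternion.Euler(pitch, yaw, 0f);
        firstPersonCamera.transform.rotation = headBone.rotation;
    }
}
```

Doing the bone override in LateUpdate is the standard Unity trick for layering procedural motion (mouse, or later Kinect joint data) on top of an animated skeleton.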
Hi,
Firstly, thank you for creating this blog. It's really helpful for fellow developers to learn from. I'm in the same situation as you, moving from web/application development to Kinect development.
I'm in the process of switching from Zigfu to Omek, because their Gesture Authoring Tool meets my requirement: I'll need to cater to a library of custom gestures, and coding up gestures by hand takes too much time, in my opinion.
Initially I was looking into using the Activate3D ICM library with Zigfu for custom gesture creation, but they were acquired by Organic Motion yesterday....
Please share any thoughts you have regarding creating custom gestures, such as a throwing motion triggering a grenade throw.
Regards,
Yi Sheng
I'll be honest, I had never heard of Activate3D ICM before now. Like I said, I'm very new to this field. And, as you rightly pointed out, they've now been acquired by Organic Motion (which would explain why all their download links now go to some generic Organic Motion page).
Truth is, I don't really know what that library does at all.
But regarding custom gestures, like I said before, the Omek SDK has definitely placed an emphasis on finding a nice solution. I even found this link (https://groups.google.com/forum/#!msg/unitykinect/GrjazB4KOQw/YirYf9JOkNgJ), where Amir Hirsch, the founder of Zigfu, suggests a person try Omek when it comes to custom gestures.
The question is whether you can use Omek for custom gestures and Zigfu for everything else (if you want to); I'm not entirely sure that's possible.
Just sharing this, since I came upon it:
A DIY 3D sensor Kickstarter - http://www.kickstarter.com/projects/codelabs/duo-the-worlds-first-diy-3d-sensor
I'm looking forward to your next blog post. :)
Thanks, Yi Sheng. Took a look at that link, and it's interesting stuff. It seems there's a new wave of "extremely precise" motion sensors coming to market very soon: the DIY 3D sensor, the LeapMotion, the new PrimeSense sensor, and the new Kinect sensor (for the Xbox 720).
Out of all of them, I'm still banking on the Kinect for my games, because although it's clearly not as precise as some of the others, it captures movement from a far greater distance, and it's pretty accurate when it comes to the type of movement I'd expect in my games.