Wednesday, December 18, 2013

Cancelled my Meta Spaceglasses Pre-Order

...And I feel sick about it.

I don't want anybody to get the wrong idea.  I still think these guys have an amazing product in the works, one which definitely has the potential to be a game changer.  But a number of factors have pushed me to hold off on putting down money on this.  Rather than rehashing exactly what those factors are, let me just post my conversation with Meta on the matter:
META: Thanks for stopping by! Can I help you with anything?
MATT: hi!
yes, first, I love the new site
very nice
and second
I'm one of those preorders, but I understood when I ordered it in september that they would be delivered in december
is that no longer the case?
META: You mean meta1?
MATT:  yep 
META: Meta1 will be shipped at March 2014
MATT: so nothing will be shipped before that?
META: NO.
MATT: and those people who pre-order now, they're not preordering meta1?
also, what is this high power pocket computer? Is this something that comes with the meta 1?
META: Not coming with meta1
only comes with meta pro.
  • Ultra-thin see-through optics: 40 degree field of view, for both displays, aligned for stereoscopic 3D
  • 720p HD for each display, fully customizable prescription
  • High-fidelity 3D surround earphones
  • The most versatile sensor array: infra-red time-of-flight real-time 3D scanner, dual RGB (video/photo) sensors, 9-axis tracking (accelerometer, gyroscope, compass), dual microphones
  • The most powerful wearable computer: i5 CPU, 4GB RAM, 128GB storage, WiFi 802.11n, Bluetooth 4.0, 32WHr battery
MATT: i see... Also, is the meta 1 23 degrees fov total, or each eye?
META: no, for both eyes.
MATT: so google glass is 15 degrees fov, and meta 1 is 23
i'd like to cancel my preorder, if that's alright
to whom should i send my information? or could i just tell you and you do it
META: Can I ask the reason you want to cancel your pre-order?
MATT: sure. first, i should say i really think you guys have an amazing product in the works
and i hope down the line, i'll be picking one up
but i think there's a definite lack of communication going on
obviously you guys are extremely busy
but it's leading to a lot of speculation
and one thing i don't want is to pay nearly 700$ for something i thought was different
META: lack of communication means...?
MATT: no updates,  or very irregular ones
no sdk
no response to emails for weeks
no forum
META: I am sorry about that.
MATT: i mean, there's no real way to get good informatioin
it's not your fault
META: we are busy on SDK
MATT: i think everybody understands that
META: we will release SDK on March 2014
MATT: and i'll use it then
but not with the glasses
i mean, here's the other thing
before the dev kit is even created you guys have the next version
and i definitely don't want to buy a crappy version when what is being invested in is version 2
take the oculus rift
they pushed out that nice dev kit
very good
and once they were nearly done with that, they started working on the consumer version/dev kit 2
dev kit 2 will be out soon, and with nearly all the features of the consumer version
META: we use the same sdk on meta1 and meta pro
MATT: but aside from the drastically different fov and the hd quality, and the look and feel, is there anything else missing from meta 1?
i wouldn't want to have a dev kit that is missing a bunch of sensors
and camera features
and the other thing. is $3,000 the price you guys are aiming for for the consumer version?
if that's the case, it's not what I was signing up for
i wanted to develop for something that undercuts glass, not overshoots it
META: meta1 is for developer without wearable pc
metaPro is an ultra high end early adopter device with shipping finished applications. The developer edition is an early release of meta1 for developers to start building apps with the most advanced augmented reality hardware as soon as possible. Key differences are a much larger FOV, much lighter weight product, much broader array of sensors providing enhanced synchronization between digital content and the real world and a distilled sensor capability allowing us to deliver an even more ergonomic and aesthetically pleasing design.
MATT: i agree, it looks great
but is that the price you'll be selling it to the masses for?
if so, I'd like to cancel my preorder
META: no
MATT: :) no?
META: final product will have different version let people to choose.
one of them is sell glasses without wareable pc
MATT: how much would that cost?
you see, the cost is VERY MUCH a factor here
i don't want to invest in programming for a device which might not reach the number of people i am hoping for
so the question is, how much less is the non-wearable pc version?
and when would that be available?
META: I can't tell you the actual price, but our goal is to sell final product at least the same price as meta1 or lower
MATT: what is the target date for the final product?
META: still working on it , meta pro will release at Jun 2014, after that, is final product.
MATT: i'm afraid i still think i'd like to cancel. Again, no fault of yours. You've answered everything i asked, and i think the potential is there for a great final product. But i want to see how things proceed over the next... say, 9 months before putting money in on this
i think there will be serious changes to the product that i will be developing for, and i don't really see a need to start my testing on the earliest version
META: To cancel your request, You can send an email to contact@meta-view.com.
MATT: k, will do. I really hope that you guys succeed with this. it looks amazing
META: Thanks. I hope we will be able to remove your doubts and make a product that surpasses your expectations.
MATT: that would be great. good luck

Saturday, October 12, 2013

Meta Bits - Meta Space Glasses Revealed, Piece By Piece

If you've been reading this blog, you've probably picked up that I'm super excited about these Meta Space Glasses.  They have huge potential, and assuming they can do half of what the company claims, they could be a major player in the burgeoning world of wearable natural interaction devices.  

But in truth, there is very little tangible information on their website - or anywhere else, for that matter.  There are some basic specs about the hardware listed, a mention that the glasses will be running off of Unity, a beautiful concept video, and lots of articles with a few minor demonstrations.

As a developer, this leaves me constantly guessing.  And I don't really like to guess, I like to know.  So through various sources (hounding the Meta team), I've managed to unveil a few minor details.  With them I've decided to start a list I call, "Meta Bits - Your source for previously unknown minor details about the Meta Space Glasses until they are released to developers in the Winter." Long name, I know.  I encourage anybody with more information to add to this list, as they see fit.

Meta Bits - Your source for previously unknown minor details about the Meta Space Glasses until they are released to developers in the Winter

  • The glasses will be running a Unity-based desktop, and through that desktop the user will be able to open certain files and programs.  For access to all Windows-based programs, the team suggests using Unity's own Virtual Desktop (I haven't actually found a Virtual Desktop that can be run in Unity).  Meta doesn't officially support this, though.
  • What is displayed by the glasses doesn't necessarily move with the movement of your head.  For example, say you are looking at some virtual picture, and you want to fix that picture at some point in space.  You can do that, and look away.  When you look back, it's exactly where you hung it. Or, if you want that picture to be fixed in your field of view, you can do that, too (see the sketch just after this list).
  • The device will be using the SoftKinetic DS325 for monitoring movements, and the iisu middleware for programming needs.  The DS325 does not do body tracking, only short-range depth and hand tracking.  The iisu Pro will be shipping free with the Meta SDK (so you'll be saving $750).
  • They know about the problems Oculus Rift had with Customs, and they're going to try and avoid them.
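Since the desktop is Unity-based, that second bullet maps onto plain Unity concepts. Here's a minimal UnityScript (Unity's JavaScript dialect) sketch of the idea - my own guess at how it might look, emphatically NOT Meta's actual API: world-locked content just keeps its own transform, while view-locked content gets parented to the head camera.

// UnityScript sketch - my own guess, NOT Meta's actual API.
var headCamera : Transform;   // the camera transform that tracks your head

// "Fixed in space": unparent the picture; it stays where you hung it.
function PinToWorld (picture : Transform) {
    picture.parent = null;
}

// "Fixed in your field of view": parent it to the head camera so it
// follows your gaze, floating at a constant offset (e.g. two meters ahead).
function PinToView (picture : Transform, offset : Vector3) {
    picture.parent = headCamera;
    picture.localPosition = offset;
    picture.localRotation = Quaternion.identity;
}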
That's it for now.  But you can be sure I'll be adding to this list in the coming weeks and months.

Tuesday, October 8, 2013

Meta Space Glasses, SoftKinetic, and a peek into their SDK


One beta version of the Meta Space Glasses
Oh, those Meta Space Glasses look so tantalizing.  Their concept video got me so pumped I went straight to their site and placed a pre-order.  I mean, if what they showed is even remotely possible, the app ideas are endless. The problem is, it has been very, very difficult to discern if what they showed IS remotely possible. Aside from their comment about building it on a Unity 3D-driven platform, there have been very few hints as to how a person could code for this device.

Until today.

By chance, I ran into an article today which said that Meta was going to use SoftKinetic to build their glasses. A quick visit to SoftKinetic's site confirmed the news:

"SoftKinetic®, the world’s leading provider of 3D vision and gesture recognition solutions, today announced that meta, maker of wearable computing and augmented reality solutions, has selected SoftKinetic’s 3D camera technology and gesture recognition middleware for the launch of its Meta. 01 wearable device. Meta. 01 is a set of augmented reality Space Glasses featuring SoftKinetic’s DepthSense® 325 camera and an accompanying Software Developer Kit (SDK) that includes iisu®, SoftKinetic’s industry-leading middleware for gesture tracking."

So now we know!  We'll be using the iisu middleware to program the gesture-based elements of our Meta applications!  Yay!  And this means we can basically start programming those apps now.

But after the whole Unity Pro/Oculus Rift fiasco, I wanted to make sure this wasn't going to be another business deal which saddles the amateur developer with additional licensing fees down the line.  So I wrote to SoftKinetic asking if we'll be receiving the ($750) iisu Pro middleware FOR FREE with our Meta glasses.  This is what they wrote back:

"Yes, our relationship with Meta includes our iisu middleware and will be included by Meta with the glasses as a full development system."

I'm pretty sure that means we won't have to pay anything.  Yay again!!

I was a little surprised to find out from them that the SoftKinetic sensor that will be used with the Space Glasses, the DS325, is short range and only does hand tracking, but I suppose I should've expected that.

At any rate, this was a nice surprise that, after many months of not writing a blog post, I just had to share.

Saturday, May 18, 2013

Ha'aretz cracked - Goodbye paywall, it's now a free for all

Spoiler: If you don't want to hear me rambling, and you just want the Ha'aretz articles for free, drag the link below into your bookmarks toolbar (by holding down the left mouse button and moving it up there), and click on it whenever you're on a Ha'aretz.com article that is asking you to pay.

A very humbling experience, these past few weeks were. As stated in my previous post, my attempt at cracking the ESPN Insider paywall was entirely unsuccessful.  I scoured their javascript code, looking for any sign of imperfection, and all I was left with was a nagging sense of failure.

Which brings me to this post.  I'm going to purge this ESPN Insider fixation by sacrificing a different paywall to the developer gods.  My offering: Ha'aretz.

Now, if truth be told, all Ha'aretz articles, like those on the New York Times website, are already accessible for free relatively easily.  All you have to do is copy the headline of the article in question into Google, then click on the relevant search result and you'll see the whole article.  But for some of us, that's just too many steps.  Hence, my bookmarklet.

As with my previous two successful attempts at cracking online media paywalls (see the New York Times post and the Wall Street Journal post), if you want to read the Ha'aretz "for pay" articles for free, just do as you did before:  Hold down the left mouse button on the link at the bottom of this post (it says, 'Ha'aretz Free'), and drag it up to your bookmarks toolbar.  Then, when you're on a Ha'aretz article which is demanding that you pay, click on that bookmarklet (the link in your toolbar) and the article will magically appear.

Below, the Ha'aretz bookmarklet:
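The link itself is the thing to drag up to your toolbar; conceptually, though, it's nothing fancy. Here's a stripped-down javascript sketch of the same Google loophole described above - not the exact code behind the link, just the principle:

javascript:(function () {
    // Send the article headline to Google; clicking the top search
    // result serves the full article, courtesy of the Google loophole.
    var headline = encodeURIComponent(document.title);
    window.location.href = 'https://www.google.com/search?q=' + headline;
})();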

Wednesday, May 8, 2013

Hitting a (pay)wall with ESPN Insider

After over a week of trying, I can now comfortably (albeit, very irritably) say that I cannot find a good way around ESPN Insider's paywall.

But all is not lost.  As with any failed attempt at doing something, I've gained some insight on what doesn't work.  And so I feel it's only fair to share that consolation prize with you fine people.

Things I've learned from trying to hack ESPN Insider, but failing:

1. ESPN isn't stupid.  They have a lot of good coders there, and they don't make dumb mistakes.  For instance, unlike the New York Times or the Wall Street Journal, they don't accidentally include a URL in their meta tags or script tags which points a person to the full version of their for-pay articles.  Meaning, there's no hidden URL in the page source that serves up the full article for free.

2. I can't figure out how or when they populate the article content.  I thought it might be as simple as calling a web service and getting the content that way, but it's not.  Or maybe it is, but I haven't succeeded in identifying where that web service is called and when.

3. Faking being a user through some nifty javascript/jquery doesn't do a damn thing. I'm pretty sure copying a registered user's cookie would solve all my issues, but that's no different than just stealing username/password info.  That's cheating.


And that's it.  I encourage anybody out there to pick up where I left off, and hopefully succeed where I failed.  As for me, I'm giving up.  This site made me waste a week, and that pisses me off.  So I gotta' move on.


Friday, May 3, 2013

Wall Street Journal and my War on Paywalls

Spoiler: If you don't care about the story, and you just want the Wall Street Journal articles for free, drag the link below into your bookmarks toolbar (by holding down the left mouse button and moving it up there), and click on it whenever you're on a WSJ.com article that is asking you to pay.

After creating a bookmarklet earlier this week which cracked the New York Times paywall, it occurred to me that I wasn't being fair.  After all, what did the New York Times ever do to me?  Why was I just targeting them?

So in the interest of fairness, I've decided to target all of them. If you're a newspaper or media outlet with an online (metered) paywall, I'm coming after you :)  UPDATE:  After further consideration, I've decided that such a mission is not only unreasonable, but just too much damn work.

Today's target is the Wall Street Journal.  If you want to read their "for pay" articles for free, just do as you did before:  Hold down the left mouse button on the link at the bottom of this post (it says, 'WSJ Free'), and drag it up to your bookmarks toolbar.  Then, when you're on a Wall Street Journal article which is demanding that you pay, click on that bookmarklet (the link in your toolbar) and the article will magically appear (after about 6 seconds).

Oh, and I'm taking requests.  If there's an online newspaper that you particularly fancy, but they're making you pay for articles after giving you some of them for free (how dare they), then just tell your Uncle Matty here and I'll take care of it as fast as possible.  UPDATE:  After following up on the below request for ESPN Insider (results to be posted soon UPDATE: results now posted), I've decided that it's a hard "no" to all requests.  They take too much time.

Now without further ado, behold, the link (below)...
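And for anyone curious what's under the hood: the general shape of a bookmarklet like this one - with a hypothetical selector, since the WSJ's real class names have to be dug out of the page source and change over time - is to wait for the paywall overlay to show up, then strip it out:

javascript:(function () {
    // Poll until the paywall overlay appears, then remove it and restore
    // scrolling. '.paywall-overlay' is a hypothetical selector - inspect
    // the page to find the real one.
    var timer = setInterval(function () {
        var overlay = document.querySelector('.paywall-overlay');
        if (overlay) {
            overlay.parentNode.removeChild(overlay);
            document.body.style.overflow = 'auto';
            clearInterval(timer);
        }
    }, 500);
})();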

Monday, April 29, 2013

Ooooh, that tricksy little New York Times...

FINAL UPDATE, 21/11/13:  Forget my bookmarklet.  There's no point, and here's why.  Just open your Chrome browser (if you don't use Chrome, use Chrome).  Once it's open, go to nytimes.com.  Once there, right click on the link of whatever article/video you want to read/watch, and choose "Open link in incognito Window."   And you're done.

I admit, I had known about using incognito mode the whole time, but I developed the bookmarklet for all those who didn't use Chrome.  I've come to realize that... well, you really should just use Chrome.

********************************************************************************

UPDATE, 15/08/13:  Still working on my "super" bookmarklet.  But in the meantime, I discovered something... unusual.  As of yesterday (at least), when the paywall pops up, if you press F5, the page reloads and the paywall goes away.  That probably won't last, but it's working right now.

UPDATE, 14/07/13: So it worked, then it didn't, then it worked again, and now it doesn't.  This back and forth is making me nauseous.  I have a plan for how to adjust things so that it will permanently work (unless the New York Times changes its arrangement with Google), and I'll post that "super" bookmarklet as soon as it's completed.

UPDATE, 22/06/13: IT WORKS AGAIN!!  But if it suddenly stops working for you, make sure to tell me.

UPDATE, 31/05/13: It would seem the New York Times has made a fix, rendering this bookmarklet temporarily useless.  I'm working on getting a new one up and running as fast as possible.

Spoiler: If you don't care about the back story, and you just want New York Times articles for free, drag the link below into your bookmarks toolbar, and click on it whenever nytimes.com asks you to pay for something.

When the New York Times threw up their paywall a few years back, I must admit I was a bit concerned.  For right or wrong, my news consumption defaulted to nytimes.com, and I was none too excited about the prospect of having to pay for my fix.  But thanks to some enterprising javascript programmer, I was given a two-year reprieve.  The bookmarklet, "NYTClean," was a clean little hack which simply wiped away the obnoxious demand to pay for content.

...until February, when it would seem the New York Times programming crew tried to get smart.  Suddenly NYTClean didn't work, no matter how many times I pressed it.

An article on Twitchy.com explained the situation.  The author contacted the New York Times spokesperson, who responded with:
When we launched our digital subscription plan we knew there were loopholes to access our content beyond the allotted number of articles each month. We have made some adjustments and will continue to make adjustments to optimize the gateway by implementing technical security solutions to prohibit abuse and protect the value of our content.
 For their part, Twitchy.com did suggest alternatives to paying, including a new and improved NYTClean bookmarklet (coined, "NYClean").  Unfortunately, the alternatives were either too cumbersome, or they didn't work (like "NYClean").

So I've developed my own solution, which I'm happy to share to all those who equally despise paying for another person's hard day's work.  To use it, simply drag the following bookmarklet (the one below, which says, 'NYTimes Free') onto your bookmarks toolbar and click on it whenever that pesky "Pay Me" screen pops up.
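For anyone wondering what a bookmarklet like this actually does: in spirit, it's just a few lines of javascript that delete the paywall overlay and give you your scroll bar back. A bare-bones sketch - the selector here is a hypothetical stand-in, not the real one, which you'd find by inspecting the page:

javascript:(function () {
    // Wipe the "Pay Me" screen: remove the gateway overlay and
    // re-enable page scrolling. '#gatewayCreative' is a hypothetical id.
    var gateway = document.querySelector('#gatewayCreative');
    if (gateway) { gateway.parentNode.removeChild(gateway); }
    document.documentElement.style.overflow = 'auto';
    document.body.style.overflow = 'auto';
})();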



Wednesday, March 27, 2013

Unity 3D + Oculus Rift: Not free... yet

Since the recent announcement by the Oculus Rift team that their Unity 3D plugin will only work with the Pro version of Unity, there has been a HUGE uproar in the Indie development community about what this implicitly means: Mainly, if you want to use Unity 3D to develop games for the Oculus Rift, you'll have to pay the $1,500 Unity Pro licensing fee.

I've noticed that roughly 90% of the vitriolic comments on this issue fall into one of two categories:
  1. People screaming about how terrible it is to force developers to pay $1,500 for a licensing fee, and then threatening to cancel their pre-orders.
  2. People angrily asking whether or not the information is true - multiple times, despite receiving responses.
Although I very much sympathize with the concerns about this unexpected price tag, a few points should be noted regarding the best method to express those concerns.

Regarding the first category, nobody is being forced to pay $1,500 for a licensing fee.  If you don't want to pay, there are other methods of developing 3D games for the Oculus Rift that are free - mainly, the newly announced FREE Unreal Development Kit which is coming out in April.

Regarding the second category of comments, I have but one relatively obvious piece of advice: Go to the source.  The moment I read this announcement, I shot off an email to the Oculus Rift support team and received a quick response.  Here's what they said:
"Hi Matt - currently the occulus rift implementation requires requires image effects (a Unity Pro feature) to distort the rendering path to properly match the hardware, from what I gather, so this is a technical limitation (not a Unity a business choice) currently. Hopefully further down the line occulus rift developers may provide a plug in which does not require these pro features to work, but as it's early days I can only recommend that you raise your concerns with them on their forums."
True, not exactly the answer I was looking for, but it is a definitive answer.  Screaming for a different answer in the forums is just pointless.
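For those wondering what "image effects" means concretely: an image effect is a post-processing pass on a camera's rendered output, and in Unity 4 the hooks for it (OnRenderImage and Graphics.Blit with RenderTextures) are Pro-only. Here's a bare-bones UnityScript sketch of the kind of pass the Rift needs - the actual barrel-distortion shader is omitted, and this is my own illustration, not the plugin's code:

// UnityScript sketch - attach to the camera. Pro-only in Unity 4.
var distortionMaterial : Material;   // would hold the lens-distortion shader

function OnRenderImage (src : RenderTexture, dest : RenderTexture) {
    // Post-process the camera's output through the distortion shader.
    Graphics.Blit(src, dest, distortionMaterial);
}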

Now let me be clear, I am just as disappointed as the next hobbyist developer about the Unity Pro requirement.  After all, Unity is a breeze to work with, and it seems to support more platforms than the UDK.  But we gotta' keep things in perspective here.  The Oculus Rift development kits haven't even started shipping yet (although technically they are supposed to tomorrow), we are all getting four free months of Unity Pro to play with, and there are a lot of very smart developers out there who probably would like nothing more than to score major points by providing the rest of us with a hack to get things working on the free Unity version.

Meaning, relax.  Take what you can get in the short-term, and if history is any indication, we'll get what we want in the long-term.

Thursday, February 28, 2013

Kinect + Unity 3D Game Development - The Great Task List

As I mentioned before, I am a VERY novice game developer. And as such, there are so many things I need to do and learn in order to make progress that it's getting really difficult to keep track of them all.  And so, I believe it's high time I start a good, ol' fashioned Task List.

Below you'll find an ever-growing list of tasks that I've either completed or need to complete in order to finish my first Virtual Reality / Kinect game.  While this is really my blogging version of thinking out loud, I do hope it ends up helping people who find themselves in a similar position to mine as I first start writing this.

TO DO:

1. Hook up the Kinect and settle on an SDK which binds it to Unity 3D - DONE. Here I explain why I chose what I chose.

2. Understand avatar tracking, and successfully create a trackable avatar of my own

3. Create a split screen view:  On the top of the screen I want to see the avatar from a distance, and on the bottom of the screen I want to see a first person view (see the camera sketch after this list)

4. Make it so that when the 1st person view camera moves using the mouse, the avatar head actually moves accordingly.

5. Ensure that the head movements are realistic.

6. Can the head be moved using the Kinect? Try and see.

7. Make another avatar which faces the first avatar, but about 10 feet away (maybe more)

8. Animate that avatar doing a throwing motion on key press.
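For task 3, by the way, I already have a rough idea of the script: a split screen in Unity is just two cameras with different normalized viewport rects. A quick UnityScript sketch - my guess at the eventual setup; attach to any GameObject and assign the cameras in the Inspector:

// UnityScript sketch for task 3: two cameras, stacked viewports.
var topCamera : Camera;      // third-person view of the avatar
var bottomCamera : Camera;   // first-person view

function Start () {
    // Viewport rects are normalized: x, y, width, height, all 0..1.
    topCamera.rect = Rect(0, 0.5, 1, 0.5);      // top half of the screen
    bottomCamera.rect = Rect(0, 0.0, 1, 0.5);   // bottom half
}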

Kinect + Unity 3D - Zigfu's ZDK vs. Omek Beckon SDK + Motion Toolkit

I must admit, I'm a bit uneasy about writing this post.  I've been in a week-long correspondence with the Omek Beckon team about getting their SDK working on my computer, and they have been more than willing to try and help.

Which is why I feel bad saying that I've chosen to go with their competitor, Zigfu.

For those who have no idea what I'm talking about, let me provide a little context with a quick recap: I'm trying to create a game which will incorporate Kinect movement tracking and (eventually) the Oculus Rift VR glasses, but I'm very much a beginner at game development. And while I've been programming for quite some time, I find my baby steps into this new programming genre are fraught with numerous challenges.

One such challenge has been deciding how I'm going to utilize my Kinect sensor within Unity 3D, my chosen game engine.  At the time of this writing, there are really only three main options:

1. Write a Unity-Kinect wrapper myself which would bind the device's SDKs to Unity

2. Zigfu's ZDK

3. Omek's Beckon SDK and Motion Toolkit

Let's just nix the first option right here.  I don't feel like reinventing the wheel. So that just leaves Zigfu versus Omek. And the best way to compare the two is to list both of their pros and cons.

Zigfu Pros
1. Has a free development version which is just as up-to-date as the professional version
2. Built off of the most updated versions of OpenNI, NiTE, and official Microsoft Kinect SDK
3. Clearly has a growing online community happy and willing to provide support
4. Very easy to install and get going
5. A good handful of example scenes that come with the installation
6. Apparently written by some REALLY smart guys

Zigfu Cons
1. In terms of official online support, I didn't see an easy way to contact the Zigfu guys.  But they seem to be active on the unitykinect (or maybe it's kinectunity) Google Group, so if you post there, you'll probably get a response from them directly
2. From what I can tell, no official documentation about their SDK or API

Omek Beckon SDK Pros
1. A development team which is approachable and very willing to help
2. Excellent documentation regarding installation procedures, API reference, and more
3. What appears to be a very good, intuitive API and Motion Toolkit package for Unity
4. They also have a free development version

Omek Beckon SDK Cons
1. It doesn't work on my development computer, and nobody seems to know why
2. It appears to be built off of less up-to-date code, and it doesn't work at all with the official Microsoft Kinect SDK
3. There is clearly less of an online community, which makes finding answers to technical or programming problems a lot harder
4. After getting it installed and working on a different computer, I couldn't get my avatar to jump, and that pissed me off

And so, I've decided to go with Zigfu. Now you might disagree with me, and I'd welcome that. The Omek Beckon guys were so helpful, they deserve another shot. Just not by me; it still doesn't work on my development computer.

Monday, February 25, 2013

Kinect + Unity 3D Glossary: Sorting out the noise

The goal seems simple enough.  I just want to build a 3D game in Unity which utilizes the motion-tracking capabilities of the Kinect and the amazing VR abilities of the Oculus Rift.  Since the Oculus won't actually be delivered to me for another few months, that leaves integrating only the Kinect. Seems simple enough.

The problem is, I have no idea where to start.  I've only dabbled in game development - and by dabbled, I mean I watched all the "Getting Started With Unity" tutorials at Unity Cookie - and I know nothing of motion-tracking.  But I do know programming, which is why I thought things would be super simple. So simple, in fact, that by the time I opened my nice and shiny, brand new Kinect 360 sensor, I had practically deluded myself into thinking that I was just a few tiny drivers away from seeing an avatar of myself jumping up and down on my computer screen.

Yeah, no.

Right after opening that box and seeing the beautiful Kinect staring back at me, I started looking for the necessary drivers/wrappers/etc.  It only took about ten minutes for me to understand how much I don't understand.  The problem was an overload of new terminology.

So to make things easier for me, and hopefully for whoever else runs into the same problem, I'm breaking things down here. In many cases, I'm not necessarily going to paraphrase what was written about some of these glossary terms, because frankly some of it doesn't yet make perfect sense to me.  But I will provide the links I found useful in deciphering exactly what was needed, what wasn't, and why.   


Glossary Relating to Motion-Tracking and Unity

OpenNI - An open-source framework for 3D natural interaction sensors. I found a good link here which explained why it would be very good to download and use OpenNI 2.0 as a means of connecting to and utilizing the Kinect.  According to the author of that link, the SDK "gives you access to the raw data provided by the sensor" (in my case, the Kinect).

NiTE - Termed "middleware," it is described by the author of this article as the glue which gives the respective sensor SDK access to higher-level "processed" data - gesture and skeleton detection and tracking. According to that same author, PrimeSense's middleware, NiTE 2.0, has been revamped to work with the new OpenNI 2.0 SDK.

Microsoft Kinect SDK - This is the official SDK created by Microsoft for their Kinect sensor.  I imagine there is a lot of overlap between this and OpenNI/NiTE, but even if you are using OpenNI/NiTE, you still need to download this because it comes with some drivers that you apparently can't get anywhere else (mentioned in the same article as before). It should be noted, however, that according to that same author, "The latest Microsoft SDK also has more resources and features [than OpenNI/NiTE], and comes with the 'middleware' built in. It also allows you to code in .NET languages as well as C++."

PrimeSense - A company which specializes in Natural Interaction device development (basically, devices that pick up your movement). These guys are in no small part responsible for the magic inside the Microsoft Kinect, and are currently producing devices that will be in direct competition with the Kinect. Furthermore, as mentioned before, PrimeSense created the NiTE middleware (just scroll down until you get to the NiTE section).

SensorKinect - Written by PrimeSense, it is described by this article as follows: "In simple terms, OpenNI and SensorKinect are the drivers that help to access RGB-D data [Depth & Color] from the Kinect. Interestingly, the latest version of OpenNI (2.0) makes installing the SensorKinect redundant."

KinectWrapperPackage - The KinectWrapperPackage is the Unity wrapper used by this project to bind the official Kinect SDK (version 1) to Unity. I've included it in this list because it came up a lot in my searches.  Personally, I wouldn't use it because: a) it doesn't seem to be updated or continually supported, and b) it is based on an old Kinect SDK and an old version of Unity, and therefore not all of the functionality will work anymore (unless you want to continue developing using the old stuff).

KinectSDK / Unity3D Interface v5 - Another wrapper which binds the Kinect SDK to Unity.  And while this seemed to be a good solution at the time it was written, it hasn't been updated since early 2012, the author has stated explicitly that he has no intentions of updating it, and it was only written to be compatible with the official Kinect SDK v1. But the source code is there, if one is so inclined to try and adjust it to be compatible with more current SDKs.

UnityWrapper - The open-source UnityWrapper appears to be the first official attempt to bind OpenNI to Unity. However, it was written using previous versions of the various SDKs, and - from what I can tell - hasn't been updated in over a year.

Zigfu/ZDK - Most of the founders and co-founders of this company are former PrimeSense and Microsoft men, so it almost goes without saying that they know what they're doing.  Furthermore, Amir Hirsch, the main founder (from what I can tell), helped author the popular UnityWrapper which integrated the Kinect into Unity.  Using what he built for that, he then moved on and created the ZDK - a more advanced wrapper that binds the Kinect to Unity, and exposes a lot of the power of OpenNI.  There's a free version of the ZDK for non-commercial use, and a $200 version for commercial use.

Omek Beckon SDK - According to the installation guide which you get after downloading the SDK, Omek Beckon is mutually exclusive with the Kinect SDK.  Meaning, if you have the Kinect SDK on your machine, you have to remove it. UPDATE!  I had previously said that one had to download OpenNI and NiTE to get this working, and that's entirely wrong.  One need only read the Beckon Installation Guide to get things moving.  I should note, however, that as of Feb. 24, 2013, the Beckon SDK did not use the most recent version of OpenNI, and they did use other drivers/wrappers that I've mentioned are already considered older (such as the SensorKinect). On the other hand, they do have some really cool features, such as the Gesture Authoring Tool, as well as some other out-of-the-box functionality from their Motion Toolkit for Unity.

Unity 3D free / pro - A fantastic program which enables even non-programmers to quickly create 3D and 2D games. 

...And that's it!  For now.  I'll add more terminology as I come across it. And if you feel like I missed something important that should be here, tell me and I'll put that up, too.

Sunday, February 24, 2013

Motion-Control Baby Steps

Click here if all you care about is getting the Kinect working with Unity 3D so you can dance like me in the below video.


In my Glossary post, I tried to make sense of all the terminology necessary in order to create a 3D Virtual Reality game using the Kinect, the Oculus Rift, and Unity 3D.  However, just knowing the terminology isn't enough.  You also have to know what to do with it.

When I first got my Kinect 360 sensor, I just wanted to use it right away.  So I downloaded the latest official Microsoft Kinect SDK and... nothing.  I hadn't done any research up until that point, so that's when I had to start.  And one of the first phrases to frequently pop up in my searches was the Omek Beckon SDK. A quick glance at its features made me think this was a magic bullet for all my challenges:
  • Motion Toolkit for Unity3D allows you to drag and drop ready to use components to quickly add gesture from within the Unity framework
  • Simple-to-use extension for developing in C# and .NET frameworks
  • Flash wrapper for quick development
  • Gesture Authoring Tool for the creation of custom gestures in minutes
  • Ability to track up to 5 skeletons simultaneously
  • Support for multiple camera positions
  • No calibration required
I admit, my first reaction was that this was going to be too easy. I mean, what's the fun in learning how to program the Kinect into my game if I don't actually do any programming?

But using the magic bullet wasn't as easy as I thought.  After removing the previous SDK that I had downloaded (the official one), I was able to successfully install the Beckon SDK (the Microsoft SDK and the Beckon SDK are incompatible).  To use the SDK with Unity 3D, I next had to import the Motion Toolkit from the Asset Store. It was then time to start up Unity and run through the sample scenes.

This is where I hit some problems.  When the sample scenes used my Kinect, many of my movements weren't being picked up.  In fact, the only movements which were picked up were moving right and left and forward and back. 

And I was suddenly at a loss.  After all, I knew nothing about ...well, pretty much anything, so it was impossible for me to debug this.  So I hit the forums and tried to find other people with my problem.

Unfortunately, this version of Beckon hasn't been out for too long, and there wasn't exactly a wealth of helpful material online.  UPDATE: Getting Beckon installed and running has been a bit problematic for me.  I'm currently in contact with the Beckon team on the matter.  Once it's installed and I've played around with it a bit, I'm going to write a post comparing it to Zigfu's ZDK, which I talk about below.

Following another lengthy visit to Google - during which I compiled the Glossary - I discovered this very helpful little video.  Not only did it have good background music, but it provided very nice instructions on just how to get the Kinect motion tracking working using nothing but OpenNI 2, NiTE 2, and the most updated official Kinect SDK (as of the time of this writing), and not Beckon.

So now I had motion tracking working!  Yay!!  ...but I had no idea how to translate that to something usable within Unity 3D.  Back to Google.

At this point, I kept running into search results regarding something called the UnityWrapper, written by Amir Hirsch (apparently an extremely smart fellow). This enabled people to incorporate Kinect functionality within Unity 3D games.  The problem was, the last time that wrapper was updated was early 2012.  However, the Read Me there pointed me to a Zigfu repository on github, so there I went. This quickly brought me to Zigfu and their ZDK.

I must admit, I had come across this page far earlier, in some of my initial searches, and quickly dismissed it.  At first, it seemed like a quick fix which costs money.  But as it turned out, I hadn't read the fine print, as it only cost money for commercial use.  And since by this time I was getting impatient, I decided to give it a try. But before doing so, I wiped my computer of any remnants of previous Kinect SDKs/drivers, as per a piece of advice I read in this article, which stated the following:

These libraries usually do not play well with each other, with each of them requiring their own driver’s and dependencies for using them. There are options to bridge these gaps but in general terms, it will be necessary to completely remove any legacy or conflicting installations before switching from one to the other. That includes drivers, .dll’s, and registry/environment path settings.

While the ZDK download page doesn't explicitly state what needs to be installed prior to using the software, I found this elsewhere on the site:  "Like all the ZDK offerings, the ZDK for Unity3D works with both Mac and PC, with OpenNI/NITE and the Microsoft Kinect SDK, and with all consumer-level 3D sensors."  So I figured it went without saying that I needed to install the latest OpenNI/NiTE, and the latest Microsoft Kinect SDK.

Which meant I didn't actually have to uninstall the other SDKs, but whatever.

After finishing the reinstalls, I opened up Unity, imported the ZDK package, opened up a sample scene - the avatar one looked nice - and ran it.  And... it worked!! It worked fabulously!!

So now I feel obliged to do two things: First, thank the ZDK and OpenNI/NiTE guys.  Awesome job.


Second, I should probably summarize, in just a few words, what I actually did to get things working.  Here it is:

Quick Guide To Getting Kinect To Work in Unity: 
1. Download the latest versions of OpenNI, NiTE, and the Kinect SDK, and install them (in that order) 
2. Import the latest ZDK package into Unity 
3. Go have fun.


I should note, the method I chose was in large part influenced by the fact that I didn't want to shell out money for anything. If you don't care about money, the Beckon SDK may be for you; or the pro version of the ZDK.

K, now it's time to end this post and go reverse engineer these ZDK sample scenes.

In the beginning...

I'm starting this blog at a moment that I view as a crossroads.  I've spent many years honing my programming skills primarily as a web applications developer, which has been more than fine for me.  The languages and technology I've used up until now have been plenty diverse and challenging, so I felt no reason to drastically switch gears.

Until this came along.

The Oculus Rift - the greatest thing to be invented since God


I have been obsessed with Virtual Reality since that weird Pterodactyl game came out in the mid-90's. I still remember my disappointment when Sega announced that they were not going to be releasing their Virtual Reality goggles due to some ridiculous health concerns (I think it had something to do with blindness).

And yet that obsession slowly waned as the technology seemed to inexplicably drift farther and farther away. I say inexplicably because to me, the technology was so amazing, it was perplexing why more people weren't trying harder to make it practical.


But they are now.  The Oculus is here, and my inner obsession is back with a vengeance. Except now I have years of programming experience under my belt.  Which means not only is now the time to get into game development, it's time to get into 3-D, virtual reality game development.

Oh, and since the motion-tracking technology of the Kinect is cool, too, it's time to get into that, as well.

Which is why I consider this a crossroads.  I'm switching programming gears, and jumping into an aspect of the field that I'm pretty sure I know very little about.  And since I'm certain there will be significant bumps along the way, now's the perfect time to start documenting.  After all, if it weren't for all my peers who documented their own struggles and solutions, I wouldn't be the programmer I am today.

Now I should be very clear here, I am NOT a prolific blogger.  I've tried blogging before, twice, and I've failed miserably, twice.  However, things are different now.  First, I'm no longer trying to blog about cartooning (in a previous life I was a professional amateur cartoonist); and second, I'm DEFINITELY not trying to blog consistently. I had read somewhere a while back that in order to be successful at blogging, one had to be consistent; that only by blogging every day or so would one create and maintain a growing readership.

And so, when I started my previous two blogs, that advice was taken to heart.  ...in the beginning.  And then, relatively quickly, I started to slack.  Eventually, the only things I posted consistently were apologies for not posting consistently.

So let it be known right now that this blog is not going to be consistent.  I will post something when I want to, and that's it.  I will not promise massive ten-part tutorials, and I will not care about maintaining a readership base. 

But, as I explained clearly in my Mission Statement, I will post.  After all, it's time to be part of the dialogue, and not just an active listener.