Saturday, October 12, 2013

Meta Bits - Meta Space Glasses Revealed, Piece By Piece

If you've been reading this blog, you've probably picked up that I'm super excited about these Meta Space Glasses.  They have huge potential, and assuming they can do half of what the company claims, they could be a major player in the burgeoning world of wearable natural interaction devices.  

But in truth, there is very little tangible information on their website - or anywhere else, for that matter.  There are some basic specs listed for the hardware, a mention that the glasses will run on Unity, a beautiful concept video, and lots of articles with a few minor demonstrations.

As a developer, this leaves me constantly guessing.  And I don't really like to guess; I like to know.  So through various sources (i.e., hounding the Meta team), I've managed to uncover a few minor details.  With them I've decided to start a list I call "Meta Bits - Your source for previously unknown minor details about the Meta Space Glasses until they are released to developers in the Winter."  Long name, I know.  I encourage anybody with more information to add to this list as they see fit.

Meta Bits - Your source for previously unknown minor details about the Meta Space Glasses until they are released to developers in the Winter

  • The glasses will run a Unity-based desktop, and through that desktop the user will be able to open certain files and programs.  For access to all Windows-based programs, the team suggests using Unity's own Virtual Desktop (I haven't actually found a Virtual Desktop that runs in Unity).  Meta doesn't officially support this, though.
  • What is displayed by the glasses doesn't necessarily move with the movement of the head.  For example, say you are looking at some virtual picture, and you want to fix that picture at some point in space.  You can do that, and look away.  When you look back, it's exactly where you hung it.  Or, if you want that picture to be fixed in your field of view, you can do that, too (see the sketch after this list).
  • The device will be using the SoftKinetic DS325 for monitoring movements, and the iisu middleware for programming needs.  The DS325 does not do full-body tracking, only short-range depth and hand tracking.  iisu Pro will ship free with the Meta SDK (so you'll be saving $750).
  • They know about the customs problems the Oculus Rift had, and they're going to try to avoid them.
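Since the SDK isn't out yet, here's a minimal sketch of what that anchoring choice might look like in plain Unity C#.  To be clear, none of this is Meta's actual API - it's just the general idea, with names I made up:

```csharp
using UnityEngine;

// A minimal sketch of the two anchoring modes described above, in plain
// Unity C#.  Nothing here is Meta API - just the general idea.
public class AnchorExample : MonoBehaviour
{
    public Camera headCamera;        // stands in for the glasses' viewpoint
    public bool fixedInView = false; // true = fixed in your field of view

    private Vector3 worldPosition;   // where the picture was "hung" in space
    private Vector3 viewOffset;      // offset from the head, for HUD mode

    void Start()
    {
        worldPosition = transform.position;
        viewOffset = headCamera.transform.InverseTransformPoint(transform.position);
    }

    void LateUpdate()
    {
        if (fixedInView)
        {
            // Head-anchored: re-place the object relative to the camera every
            // frame, so it stays at the same spot in your field of view.
            transform.position = headCamera.transform.TransformPoint(viewOffset);
            transform.rotation = headCamera.transform.rotation;
        }
        else
        {
            // World-anchored: leave it where it was hung.  Look away and
            // look back, and it's exactly where you left it.
            transform.position = worldPosition;
        }
    }
}
```
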
That's it for now.  But you can be sure I'll be adding to this list in the coming weeks and months.

16 comments:

  1. Thanks so much for the information. I've also been trying to gather as much information as possible.
    1- I don't think "Unity's Virtual Desktop," if there is such a thing, has anything to do with this application. A Google search returns a bunch of Linux topics (Ubuntu's Unity desktop, not the game engine).
    2- For AR applications it's important that the augmentations don't move with the head but remain attached to their targets, except for notifications and HUD items. In fact, to be worth anything in AR, it has to do this task extremely well.
    3- SoftKinetic's accuracy concerns me: http://www.softkinetic.com/Support/Forum/tabid/110/forumid/30/postid/1472/scope/posts/language/en-US/Default.aspx I wonder if Meta has had any success getting better accuracy. SoftKinetic basically says it's a hardware issue, but I think there could be a software solution. Currently on my AR project (obviously not using Meta's device) we have a lot of software optimizations for what really are hardware failings; there's a sketch of the simplest example at the end of this comment.
    4- Didn't know. Thanks.

    Some unsolved questions from me: Looks like Meta is using the Epson BT-100 glasses. These glasses chop the horizontal resolution in half for 3D mode, which I'm assuming would be important for most AR apps. This looked to be an OS issue on Epson's side. Has Meta solved this?
    They are using SoftKinetic's short-range camera (I'm glad, since I need short range), but what is the far limit for tracking on this camera? (SoftKinetic says 1.0 meter, but I think that makes any AR application impossible.) You could use just the RGB cam, but then you limit your tracking ability.
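
    As promised in my point 3, here's a minimal sketch of the simplest software fix I mean: exponentially smoothing the raw tracked position before using it. The smoothing factor is a made-up starting point, not a value from SoftKinetic or Meta:

    ```csharp
    using UnityEngine;

    // Simplest software fix for sensor jitter: exponential smoothing of the
    // raw tracked position.  Trades a little latency for a lot less jitter.
    public class JitterSmoother : MonoBehaviour
    {
        [Range(0f, 1f)]
        public float smoothing = 0.8f; // 0 = raw (jittery), 1 = frozen; tune per environment

        private Vector3 filtered;
        private bool initialized = false;

        // Call this with each raw position coming out of the depth camera's tracker.
        public Vector3 Filter(Vector3 raw)
        {
            if (!initialized)
            {
                filtered = raw;
                initialized = true;
                return filtered;
            }
            // Blend the new sample toward the running estimate.
            filtered = Vector3.Lerp(raw, filtered, smoothing);
            return filtered;
        }
    }
    ```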

  2. Hi Benn, thanks for the feedback. Your #3, regarding the accuracy of SoftKinetic, is very interesting. I'm definitely going to follow up on that. I'll ask your questions regarding the Epson resolution and the far-distance tracking, and post the responses here.

  3. Any news? I've been trying to find anything I can. There are new articles on CNET and CNN about Meta. I found a site claiming that Meta just went through another round of VC funding (http://www.crunchbase.com/company/meta-view), but they haven't responded to any of my emails for a month, which I find a little odd, as I think my organization and theirs could be good for each other.

    Replies
    1. Hey Benn,

      I haven't found out anything new. To be honest, they've always answered my emails, but sometimes it takes them a few weeks to do so. This is why I try to catch them when they're "online to chat," which occasionally happens when I go to their site. But recently I haven't succeeded in doing that.

      It would definitely be nice if there were more constant updates out there, but I'm pretty certain they're in "start-up" mode right now, hunkering down and hammering out code / improving the hardware. They're not really looking around and doing PR.

      My gut feeling is they'll have something nice, but not even close to perfect when the dev kits get shipped out. Kind of like the Oculus (nice, but not close to perfect). But all they need is more attention and more credibility to get the money which will allow them to grow and improve at a better pace. And with that growth will hopefully come better PR and outreach to the dev community.

  4. Well, I have been in contact with Meta concerning their claim about which programming language to use when developing apps for their glasses. On their site it says that all you need is JavaScript knowledge (UnityScript), but for every other device I've tried to use in Unity3D, I could only access the device through C#. When I asked them that question, the person I talked to told me he didn't know the answer. He told me to leave my email and said he would pass the question on to their devs and get back to me, but I never heard back, and it has been a week and a half.

    Replies
    1. Interesting that you've brought up the JavaScript (UnityScript) thing. Their website is pretty sparse on details in general, which I understand, because things are probably changing; but of the few details they give that are useful for development, there's a whole section on JavaScript. One of my questions was exactly what you are asking, but since I wasn't getting answers to even more basic questions, I didn't try to ask it. On their careers page, they say they are looking for a Unity3D developer with C/C++ plugin experience. For my team's AR application I'm going to have to access OpenCV anyway, so we'll be writing our own plugins, and yes, I think this is easier in C# than in UnityScript (rough sketch of what I mean below). I'd like to start porting our work into Unity from the package we are currently using, but I don't know what language to use; C# seems like the obvious choice. Their endorsement of UnityScript was a little odd.
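
      For reference, here's roughly what calling a native C/C++ plugin from Unity C# looks like - the shape of the OpenCV bridge I have in mind. The plugin name ("arplugin") and the exported function are placeholders for things we'd write ourselves, not anything Meta ships:

      ```csharp
      using System.Runtime.InteropServices;
      using UnityEngine;

      // Rough sketch of calling a native C/C++ plugin from Unity C#.  The
      // plugin and function names are placeholders, not anything Meta provides.
      public class NativeBridge : MonoBehaviour
      {
          // Matches a C function exported from the plugin, e.g.:
          //   extern "C" int DetectMarkers(float* poses, int maxMarkers);
          [DllImport("arplugin")]
          private static extern int DetectMarkers(float[] poses, int maxMarkers);

          void Update()
          {
              // Room for 16 markers, 7 floats each (position xyz + quaternion xyzw).
              float[] poses = new float[16 * 7];
              int found = DetectMarkers(poses, 16);
              for (int i = 0; i < found; i++)
              {
                  Vector3 position = new Vector3(poses[i * 7], poses[i * 7 + 1], poses[i * 7 + 2]);
                  // ...apply 'position' (and the quaternion at poses[i*7+3..6]) to a GameObject here.
              }
          }
      }
      ```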

  5. Here's a letter I sent meta this morning. If I get an answer I'll post it here.

    Good morning Meta,

    I bought one of the development kits and I have a couple of development related questions:

    1. I understand that the Meta glasses use the Epson BT-100 for the screens. Epson says that they decrease the horizontal resolution by half when used in 3D (stereoscopic) mode. This appears to be a software issue on Epson's side. Do your glasses cut the horizontal resolution by half in stereoscopic mode? In producing software for stereoscopic mode, should we produce two 960x540 outputs or one 1920x540 output?

    2. The Depth map and RGB feed are available to use according to the website. What tools, plug-ins, etc. can we use to analyse these feeds? Is this mostly OpenCV? Will there be support in the SDK to pass information between OpenCV and Unity?

    3. Does your Surface Tracking package support tracking of traditional fiducial markers? Does Unity have access to marker 6-DOF and marker identification? Do we need to build this functionality ourselves?

    4. Does your Surface Tracking package support tracking of curved surfaces?

    5. Do you know what level of positional/orientational "noise" exists in tracking of objects? i.e. how much jitter occurs in the virtual representations in centimeters/millimeters or in percentage?

    6. Have you thought about using a laser pointer to specify a certain surface in the environment for placing augmentations? In this way the laser pointer becomes a type of 3D mouse and can additionally specify locations on that surface.

    -Benn

    Replies
    1. This comment has been removed by the author.

    2. Those are great questions, Benn. I look forward to hearing their answers.

  6. I got a response today:


    Hi!
    Thanks for getting in contact with Meta.
    The 3D resolution is two 480 x 540 displays, so 960 x 540 split in two.
    Depth and RGB are directly accessible; OpenCV is included, and SoftKinetic's iisu middleware is also included.
    Yes, we support marker tracking, with 6DOF transforms, all in Unity.
    Measuring jitter in algorithms is a really great idea, and we will implement it in the future to improve our algorithms. We will let you know when we have the first metrics.

    The laser pointer is a cool idea. Right now we are focusing on hand-based input, but that could be a great future input.

    Hope this answers your questions,
    Thanks


    My comments:

    1- Looks like the resolution is cut in half for 3D (stereoscopic) mode. That means we're going to have to compress the horizontal aspect of the two cameras in Unity. The only way I can think to do this is via Render to Texture, which is a Unity Pro feature - though plain viewport rects might also work (first sketch below).
    2- Glad to hear OpenCV is part of the package. iisu may be neat, but I have zero experience with it. We'll see.
    3- Traditional markers = yes.
    4- They didn't really answer, although journalists' reports seem to say that it can.
    5- No real data on jitter. That's OK; it's actually hard to make these kinds of measurements, and they tend to change drastically in different environments. I'll have to do my own testing, then (second sketch below).
    6- Seems like most of the focus is on hand gestures, which is fine.
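
    Here's the first sketch, for point 1: two Unity cameras rendered side by side with viewport rects, with each camera's aspect forced to the full-screen value so the image is squeezed horizontally (the 3D display should stretch each half back out). This is my guess at a setup, untested against the actual glasses:

    ```csharp
    using UnityEngine;

    // Side-by-side stereo via viewport rects.  Each eye renders into half of
    // the 960x540 screen with the full-screen aspect forced, so the image is
    // squeezed horizontally; the display stretches each half back out.
    public class SideBySideStereo : MonoBehaviour
    {
        public Camera leftEye;
        public Camera rightEye;
        public float eyeSeparation = 0.064f; // rough interpupillary distance, meters

        void Start()
        {
            leftEye.rect = new Rect(0f, 0f, 0.5f, 1f);    // left half of the screen
            rightEye.rect = new Rect(0.5f, 0f, 0.5f, 1f); // right half

            // Force the full-screen aspect (960/540) so each half is squeezed.
            leftEye.aspect = 960f / 540f;
            rightEye.aspect = 960f / 540f;

            // Offset the eyes horizontally for parallax.
            leftEye.transform.localPosition = new Vector3(-eyeSeparation / 2f, 0f, 0f);
            rightEye.transform.localPosition = new Vector3(eyeSeparation / 2f, 0f, 0f);
        }
    }
    ```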
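
    And the second sketch, for point 5: a crude jitter meter that logs the standard deviation of a tracked object's position over a sliding window while the real-world target is held still. The window size is arbitrary:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    // Crude jitter meter: standard deviation of a tracked object's position
    // over a sliding window, logged in millimeters.  Hold the real-world
    // target still while measuring.
    public class JitterMeter : MonoBehaviour
    {
        public Transform tracked;    // the virtual object driven by the tracker
        public int windowSize = 120; // ~2 seconds at 60 fps

        private readonly Queue<Vector3> samples = new Queue<Vector3>();

        void Update()
        {
            samples.Enqueue(tracked.position);
            if (samples.Count > windowSize)
                samples.Dequeue();
            if (samples.Count == windowSize)
                Debug.Log("Jitter (std dev): " + (StdDev() * 1000f) + " mm");
        }

        float StdDev()
        {
            // Mean position over the window.
            Vector3 mean = Vector3.zero;
            foreach (Vector3 s in samples) mean += s;
            mean /= samples.Count;

            // Root-mean-square distance from the mean.
            float sumSq = 0f;
            foreach (Vector3 s in samples) sumSq += (s - mean).sqrMagnitude;
            return Mathf.Sqrt(sumSq / samples.Count);
        }
    }
    ```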

  7. Hey Matt,
    thanks for this post, it's so interesting.
    I have a question: do you think the beta version of the Space Glasses can be used with a normal smartphone?

    Replies
    1. From what I understood, the dev glasses won't work without being connected to a computer. I also understand that they are working to make future versions of the glasses completely independent devices.

  8. I don't know if you have found this video. I hadn't seen it despite looking around.
    It's an old interview with Gribetz but it has more details about certain things.

    http://www.youtube.com/watch?v=d_cjYFAVlrs

    Key things I learned:
    1) The glasses don't work outside or in bright sunlight.
    2) At the time of the interview they hadn't exactly aligned the virtual screen output with the real-world view through the transparent glasses. Gribetz had the Epson BT-100's dark opaque plastic cover on because they hadn't gotten the two images to align yet. Newer videos seem to show that they have fixed this.
    3) They do seem to be able to simultaneously track the hands and the floor which is something I really need for my application.

    Also FYI- Meta's twitter feed has an enigmatic statement: "| 12.16.13 | Get ready for the next step." - No idea what that's supposed to mean.

    Replies
    1. Yeah, I had heard that about the indoors-only limitation. They're using infrared sensors, which I assume get washed out by sunlight.

      And I hadn't heard the 12.16.13 thing. That's one week away! Man, I really hope they start shipping.

  9. Hi all,
    "| 12.16.13 | Get ready for the next step." has meant, presentation of new MetaPro but also a big delay in the shipment of previous orders. :( :(

    Now I need your help guys, please.
    I need to buy hardware similar to the Meta 0.1 so I can finally start my project, because I can't wait until June 2014.
    I also realized, thanks to this forum, that the camera is the SoftKinetic DS325 and the glasses are similar to the Epson Moverio BT-100.
    For the camera I'm fine, I can start to use the DS325.
    However, I have a problem with the glasses: I just realized that the Epson Moverio can't be connected directly to a computer and used as a normal output device.

    "EPSON Moverio BT-100 allows you to access and enjoy content from a variety of sources. Moverio offers three ways to transfer content: Download it from a computer over USB, save it on a microSD or microSDHC memory card, or stream over Wi-Fi®."

    As you can see, Epson's description explains that you can only transfer content to the glasses, and I suppose "content" means videos, pictures, or Android applications.

    Can any of you confirm these limitations of the Epson device?
    If the Epson glasses are as I described, do you have any ideas about which glasses I could buy to start my implementation while waiting for the Meta shipment?

    Thanks
