Tag Archives: motion capture
Multi-user Wireless Virtual Reality System: Real Virtuality
A couple of years ago we checked out VRcade, a virtual reality arcade system that freed players to move around, translating their movements into the game with the help of motion capture. Artanim's Real Virtuality is a lot like that, except it's geared for multiple users, and not just for games but for other content as well.
Much like VRcade, Real Virtuality uses cameras mounted around the room to track markers placed on the users. Players are free to move thanks to the computer and wireless hardware stuffed into a backpack.
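Artanim hasn't published its tracking pipeline, but optical systems like this generally work by triangulating each marker's 3D position from two or more calibrated cameras. Here's a minimal sketch of that step using the standard linear (DLT) method, assuming hypothetical 3x4 projection matrices for a pair of cameras:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a marker's 3D position from its pixel coordinates in two
    calibrated cameras (P1, P2 are 3x4 projection matrices)."""
    u1, v1 = uv1
    u2, v2 = uv2
    # Each camera view contributes two linear constraints on the
    # homogeneous point X (cross product of pixel ray and projection).
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution of A @ X = 0 is the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

In a real system this math runs per marker, per frame, after matching each marker's blob across camera views, which is where the hard engineering lives.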
While I’m sure the VR experience itself is very convincing in person, another impressive aspect of Real Virtuality is its extremely low latency. As you’ll see in Tested’s demo and interview below, the experience is seamless enough that users can easily make coordinated movements such as a handshake and even toss objects to each other.
Perhaps the pessimistic prophecies about virtual reality, in which we all just sit or lie down all day lost in our individual bubbles, won't come to pass. Maybe these fake worlds and barriers will encourage us to move and connect here in the real one. Until, of course, our smartphones make their way into VR. Check out Artanim's website for more on Real Virtuality.
[via Tested]
Inversion Project Lets You Go Wireless with the Oculus Rift: Kinectic
Last November we heard about VRcade, a virtual reality system that lets the user move around while wearing a VR headset, thanks to wireless wearable electronics and cameras. A company called Zero Latency is working on the Inversion Project, a very similar setup for VR poster child Oculus Rift.
Details are scarce about the Inversion Project, but I'm going to bet that it also requires cameras or motion sensors in addition to the hardware that's worn or carried by the user. The video below demonstrates the technology with the help of a simple zombie game, disappointingly called Zombie Fort: Smackdown and not Rift 4 Dead.
Zero Latency will demo the Inversion Project on Feb. 16 at the Pause Festival in Melbourne, Australia. Hopefully details will trickle out of the event soon after.
[via PSFK]
FaceRig Turns You into a Digital Avatar in Real Time: Self-e
Here’s a program that could be one of the big hits of 2014. Currently in development by Holotech Studios, FaceRig lets anyone with a webcam project their head movements and facial expressions onto a virtual character, all in real time. It’s Dance Central for your face.
According to Holotech Studios, FaceRig is based on “real time image based tracking technology” made by Swedish company Visage Technologies. Aside from tracking and mapping your head and face, voice alteration will also be included in FaceRig. So you can become a voice actor, a motion capture actor and an animator all at once.
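Holotech hasn't detailed how the tracked features actually drive the avatar, but a common approach in face rigs is to calibrate a neutral pose, then normalize each measured feature (mouth width, eyelid gap, and so on) into a 0-to-1 blendshape weight. A rough sketch, with entirely made-up feature names:

```python
def blendshape_weights(neutral, extremes, current):
    """Map raw facial measurements (e.g. mouth-corner distance in pixels)
    to 0..1 blendshape weights, by normalizing each value between its
    calibrated neutral pose and its calibrated extreme pose."""
    weights = {}
    for name, value in current.items():
        lo, hi = neutral[name], extremes[name]
        w = (value - lo) / (hi - lo)
        weights[name] = min(1.0, max(0.0, w))  # clamp to the valid range
    return weights
```

The avatar renderer then blends each expression shape by its weight every frame, which is how your smirk ends up on a cartoon raccoon in real time.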
So what can you do with FaceRig? For starters, you can stream a show online using your avatar as your visage. You can be the next Hatsune Miku! Or rather, Half-sune Miku. You can make a simple animated film without spending a single second or cent on 3D modeling software. Or you can just make funny faces all day.
Holotech Studios plans to release several versions of FaceRig for different devices and use cases, such as a full-featured desktop program for professional use and a mobile app for funny faces. For now, a pledge of at least $5 (USD) on Indiegogo will score you both a beta and a full license to the basic version of FaceRig.
[via Incredible Things]
VRcade Combines Motion Capture with VR Headsets: The Arcade is Dead, Long Live the Arcade!
A new company called VRcade aims to revive the idea of a gaming arcade with the help of virtual reality. Whereas VR headsets like the Oculus Rift need to be wired to a computer to work, VRcade’s headset has a wireless transmitter. Why? Because VRcade isn’t just a headset, it’s an entire room. Or even an entire floor. When you move in the real world, you move in VRcade’s virtual world.
Aside from its wireless headset, VRcade uses motion capture cameras and a modular motion capture suit. In addition to the suit, there are also markers on the headset and whatever prop you have – like the gun in the image above – that the cameras can use to track your movement. In other words, while other VR headsets can track only your head, VRcade tracks you.
VRcade has several advantages over what VRcade CEO and co-founder Jamie Kelly calls “virtual sit down gaming.” VRcade’s games will encourage player movement: walk, run, sneak or jump in the real world and you do the exact same thing in the virtual world. As far as controls go, it doesn’t get more intuitive than that. For instance, VRcade claims that the tester in the video below has no experience with first person shooters, but she still figures out how to navigate in the virtual world:
The correspondence between movement and virtual output also reduces the risk of motion sickness, unlike when you're experiencing VR while confined to one spot. Finally, and perhaps most importantly, there's that extra layer of immersion users get from being able to physically feel their movements. Here's Kelly explaining the basics of VRcade:
VRcade seems really promising: the second coming of the arcade, but more inclusive and possibly even healthier. VRcade can also adapt its system for non-gaming purposes, such as giving virtual tours of structures that have yet to be built.
But of course nothing is perfect. As Ars Technica notes, VRcade faces a chicken-or-egg problem when it comes to attracting game developers. The company needs developers to make games for its system, but a VRcade game has to be tailored to a particular physical space, so a developer needs the dimensions of a rented or purchased venue before building the game, while venues won't commit until there are games to fill them. Hopefully VRcade can figure that out.
[VRcade via Ars Technica]
Microsoft patent applications take Kinect into mobile cameras, movie-making
Microsoft has never been shy about its ambitions for Kinect's depth sensing abilities. A pair of patent applications, however, show that its hopes and dreams are taking a more Hollywood turn. One patent has the depth camera going portable: a "mobile environment sensor" determines its trajectory through a room and generates a depth map as it goes, whether it's using a Kinect-style infrared sensor or stereoscopic cameras. If the visual mapping isn't enough, the would-be camera relies on a motion sensor like an accelerometer to better judge its position as it's jostled around. Microsoft doesn't want to suggest what kind of device (if any) might use the patent for its camera, but it's not ruling out anything from smartphones through to traditional PCs.
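The filing's pairing of visual mapping with an accelerometer is a classic sensor-fusion setup, though the patent doesn't spell out the math. One simple (and purely illustrative) way to blend the two position estimates is a complementary filter: trust the smooth inertial estimate in the short term, and let the drift-free visual map pull it back over time:

```python
def fuse_position(imu_est, visual_est, alpha=0.98):
    """Complementary filter for one axis of position.

    imu_est:    position dead-reckoned from the accelerometer
                (smooth, but drifts as integration errors pile up).
    visual_est: position from matching against the depth map
                (noisy frame to frame, but doesn't drift).
    alpha:      closer to 1.0 leans on the IMU between visual fixes.
    """
    return alpha * imu_est + (1.0 - alpha) * visual_est
```

A shipping device would more likely run a Kalman filter over the full 6-DoF pose, but the intuition is the same: each sensor covers the other's weakness.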
The second patent filing uses the Kinect already in the house for that directorial debut you've always been putting off. Hand gestures control the movie editing, but the depth camera both generates a model of the environment and creates 3D props out of real objects. Motion capture, naturally, lets the humans in the scene pursue their own short-lived acting careers. We haven't seen any immediate signs that Microsoft is planning to use this or the mobile sensor patent filing in the real world, although both are closer to reality than some of the flights of fancy that pass by the USPTO -- the movie editor has all the hallmarks of a potential Dashboard update or Kinect Fun Labs project.
Microsoft patent applications take Kinect into mobile cameras, movie-making originally appeared on Engadget on Thu, 02 Aug 2012 18:04:00 EDT.