13 Mar Perception Neuron V2 Review
The Perception Neuron V2 is designed to provide professional-level motion capture in an adaptable, affordable package. The system can capture data through a direct connection to a computer, stream it over Wi-Fi, or store it on an SD card, giving you several recording options. The effective range is also generous: I was able to record movement from 20m (about 65ft) away with little to no issue.
What’s Included
- 32 Neurons (nodes)
- Straps for Head, Hands, Upper and Lower Body, and Feet
- Two spare straps in case the others are damaged
- Hub
- Cord for data transfer
- Two Anti-MAG Neuron Containers
Prepping for Use
Setup and teardown of the suit is a tedious affair. It took me about 15 minutes to put the suit on and another 15 minutes to take it off without any outside assistance. The straps must be placed at specific locations on your body, or the recording software will become confused. While the Perception Neuron V2 includes spare straps that can be reassigned, moving the straps into the correct position was annoying.

The Perception Neuron nodes are sensitive instruments that use accelerometers and magnetic sensors to detect a user's movements and motions, which is how the system stays a low-cost motion capture solution. The trade-off is that the nodes themselves are sensitive to magnets and can be ruined if they get too close to devices that generate magnetic fields. When not in use, the nodes should be stored in the provided case, which shields them from magnetic interference that would otherwise scramble them. Beyond the time it takes to remove and insert the nodes (even more of a production when all 32 are needed), they must be kept away from electronic devices such as keyboards, mice, and monitors. The nodes are about the size of a penny and fragile; too much force can snap them or damage their connectors. The company was quick to send us replacements for the nodes we broke.
Putting Perception Neuron V2 to the Test
The quality of the recordings was rather high; I was surprised by how well the system captured my actions. Pushing the recorded animations into Unity was an easy task, and while the straps occasionally slid around, we were able to record a full project's worth of animations quickly. There were a couple of issues, though. When the straps slid, the resulting animation clip could look unnatural and wonky; a walking animation might come out stiff or broken-looking. And while the hands tracked surprisingly well, they could be a bit sensitive at times. There were several animation clips where my fingers looked like they were broken.
Over the course of our testing, we scheduled two big shoots: the first at our office and the second at a nearby empty office space. Both sessions taught us a lot about how the system operated and how to use the motions we captured with it.
One technique that helped our development was mapping out the area and dividing it into 1-meter squares. This let us plan exactly where and how to move in the space in order to record the animations we needed.
Another technique was zeroing out the model before every take, which meant walking to a set location and recalibrating the suit. While a bit tedious at times, this gave us a chance to correct any strap sliding or drift that might have occurred, and it also gave the crew a focal point to base animations around.
Final Thoughts
While I think the Perception Neuron V2 is a powerful hardware tool, I do feel there is room for refinement. The suit tracks broad motion well, but subtler movements and actions weren't captured accurately and needed to be adjusted in an outside program. With further training, though, we overcame these issues and were able to create great animations. For the cost of this device, what we were able to do with it was extremely impressive, and I look forward to working with it more in the future.