
Neuron Demo

DeepMotion Neuron

The Digital Character Dojo!

Physics-based character animation is no longer a million-dollar tool reserved for AAA studios. With DeepMotion Neuron, anyone can physicalize their own characters and teach them complex motor skills for games, VR, AR, and film.

Learn about the perks of supporting this campaign

Support DeepMotion Neuron

Intelligent Character Simulation is Coming

We are proud to announce the pre-release of Neuron: the Digital Character Dojo for teaching motion skills to digital characters. Neuron is the first tool for fully procedural, physics-based character animation.

We’re asking developers and animators to join our pre-launch and help bring Motion Intelligence to a wider audience. Leveraging decades of experience in game development, animation, and machine learning, we’ve built an incredible suite of tools that enables natural, physics-based animation. Now, we’re ready to invite you to test the boundaries of character simulation and join us in initiating a new wave of interactive storytelling.

As an early adopter, you will receive perks and discounts for backing our pre-launch. Learn more about how Neuron works and the benefits of supporting the campaign below.


What is DeepMotion Neuron?

DeepMotion Neuron is a Behaviors-as-a-Service platform built around three offerings: Behavior licenses, cloud training with retargeting, and a Run-Time Package.

Once you buy a Behavior, you can use it on as many characters as you like. Retargeting is necessary for training Behaviors on custom characters, so that the motion is scaled to each character’s physical form; we charge a per-minute fee for training time. Our Run-Time Package enables DeepMotion physics in your game or experience.

Each DeepMotion Neuron account comes standard with two free assets: the Basic Behavior Set and a standard character.

Anyone will be able to sign up for a free DeepMotion Neuron account. Free accounts come with the Basic Behavior Set, a suite of primitive Behaviors like standing, getting up, turning, and walking, as well as a standard character for off-the-shelf use, which can perform the Basic Behavior Set. The WebGL demo character is trained on the Basic Set plus additional behaviors.

How It Works: Six Steps

Step 1: Upload Your Character

Take your bipedal character file and upload it to the cloud!
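
For developers who prefer scripting to a web dashboard, here is a minimal sketch of what an upload could look like. The endpoint, credential, and field names are purely illustrative assumptions; Neuron's actual upload flow is the cloud dashboard described in these steps.

```python
# Hypothetical sketch only: this endpoint and field naming are assumed for
# illustration, not a documented Neuron API.
import requests

API_URL = "https://api.example.com/neuron/characters"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                               # placeholder credential

def upload_character(fbx_path: str) -> str:
    """Upload a rigged biped character file and return its new id."""
    with open(fbx_path, "rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"character": f},
        )
    resp.raise_for_status()
    return resp.json()["character_id"]

if __name__ == "__main__":
    print(upload_character("hero_biped.fbx"))
```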


Step 2: Add Physics

Configure your character to our simulation rig and we’ll automatically add physical attributes scaled to your character, like joints, musculature, bone density, and more.
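
To make "physical attributes" concrete, here is a small illustrative sketch of the kind of per-joint data a simulation rig attaches. The attribute names and numbers are assumptions for illustration, not Neuron's actual schema.

```python
# Illustrative only: these attribute names and values are invented to show the
# kind of data a simulation rig carries, not Neuron's real configuration.
from dataclasses import dataclass

@dataclass
class JointConfig:
    name: str
    min_angle_deg: float   # rotation limit, e.g. a knee cannot hyperextend
    max_angle_deg: float
    max_torque_nm: float   # "muscle strength" available at this joint

def build_rig(mass_kg: float):
    """Return joint configs with torques scaled to the character's mass."""
    scale = mass_kg / 70.0  # scale relative to a 70 kg reference body
    return [
        JointConfig("knee_l",  0.0, 150.0, 180.0 * scale),
        JointConfig("knee_r",  0.0, 150.0, 180.0 * scale),
        JointConfig("elbow_l", 0.0, 145.0,  60.0 * scale),
        JointConfig("elbow_r", 0.0, 145.0,  60.0 * scale),
    ]
```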


Step 3: Select Your Behaviors

Select behaviors to teach your character interactive motions like backflipping, leaping, combat moves, and more. Upload your own training data or browse our Motion Library to add Behaviors to your Collection.


Step 4: Train!

Hit "Train" and we’ll retarget the selected skills to your character rig using our custom algorithm.


Step 5: Parameterize

Modify your character’s physical attributes to affect their movement style and strength.
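
As a rough mental model, parameterization scales the rig values set up in Step 2. The snippet below is an illustrative assumption (the parameter names are invented) showing how strength and stiffness multipliers could shift a character's movement style.

```python
# Illustrative assumption: "muscle_strength" and "joint_stiffness" are invented
# names for this sketch, not Neuron settings.
def parameterize(joint: dict, muscle_strength=1.0, joint_stiffness=1.0) -> dict:
    """Scale a joint's strength and stiffness.

    Weaker muscles tend to read as labored, floppy motion; higher stiffness
    reads as crisper, more rigid motion.
    """
    tuned = dict(joint)
    tuned["max_torque_nm"] = joint["max_torque_nm"] * muscle_strength
    tuned["stiffness"] = joint.get("stiffness", 1.0) * joint_stiffness
    return tuned

# A tired, floppy variant of the same character:
weak = parameterize({"name": "knee_l", "max_torque_nm": 180.0},
                    muscle_strength=0.5, joint_stiffness=0.7)
```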


Step 6: Download

Export a new control file for your intelligent character and download our SDK for Unity or Unreal to work in your preferred game engine.


The Rest is Only Limited By Your Imagination

Use DeepMotion Neuron to include interactive characters in your extended reality experiences, automate tedious animation cycles, or save time prototyping!


FAQ

What is the difference between Neuron and other character animation methods?

Interaction and speed. Alternative methods for character animation fall into a few categories: keyframe animation, motion capture, and procedural techniques like inverse kinematics. These methods are useful for creating hand-crafted, canned animations (or, in the case of IK, approximating procedural motion), but they can be time-consuming and costly. Neuron allows for entirely procedurally generated, physically interactive character simulation.

Will DeepMotion make traditional animators obsolete?

Far from it. DeepMotion empowers animators to focus on the details and expressive touches that make animation great, while handling the basics of movement and interaction—which can be quite tedious. Photoshop didn’t eliminate photographers; it gave them tools to create great pictures faster and more easily. We want to empower creatives to take their talent even farther with AI assistance.

What are the applications of DeepMotion Neuron?

Primary use-cases include interactive character simulation in MR, VR, AR, Gaming, as well as character animation in Film, Visual Effects, Crowd Simulation, Rapid Prototyping, 3D Avatars, and Digital Emojis. We are also exploring Neuron use-cases in the realms of Robotics, Medical Visualization, Industrial Training, and more.

What about non-bipedal characters?

While our technology can support training for additional character types like quadrupeds, the first release of Neuron only supports bipedal characters. Users can still leverage the Unity SDK to create multi-legged physically simulated characters (tutorials on how to do this are available through our YouTube Channel and Blog).

What do you mean “Interactive Characters”?

Our “Simulation Rig” imbues the character with physical attributes and constraints. The physicalization each character undergoes simulates the joints and muscles typically found in a real-world human. We add torque, limits on joint rotation (for example, in the elbows and knees), and so on. Our simulation rig also optimizes functional objectives, like maintaining balance, mimicking the way people learn to stay upright. This is the basis for real-time, physical interaction between end-users and characters. For example, a Neuron character will stumble when pushed.
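
As a generic illustration of why physical simulation reacts to contact (this is not DeepMotion's actual controller), consider a torque-limited joint servo: an external push changes the joint state, so the next torque command, and therefore the motion, changes with it.

```python
# Generic PD-style joint servo, shown only to illustrate torque-limited,
# physics-driven control; not DeepMotion's algorithm.
def pd_torque(target_angle, angle, angular_vel,
              kp=80.0, kd=8.0, max_torque=180.0):
    """Torque pulling a joint toward its target pose, clamped to muscle limits."""
    torque = kp * (target_angle - angle) - kd * angular_vel
    return max(-max_torque, min(max_torque, torque))

# If a push perturbs `angle` and `angular_vel`, the next torque command changes
# too, so the character visibly absorbs and recovers from the impact instead of
# playing back a fixed clip.
```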

How does Neuron stitch multiple behaviors together?

After training your character on the desired motion skills, we do a second round of training to create a “Motion Brain”. Using machine learning, the Motion Brain defines connections between different behaviors to allow transitional locomotion. These simulated transitions look like blended animations at runtime.
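
A simplified way to picture the result is a graph of allowed transitions between behaviors. The graph below is hand-written for illustration, whereas the real Motion Brain learns these connections.

```python
# Hand-written transition graph, purely to illustrate behavior stitching;
# the actual Motion Brain is learned, not authored like this.
TRANSITIONS = {
    "stand":  {"walk", "turn", "get_up"},
    "walk":   {"stand", "turn", "leap"},
    "turn":   {"stand", "walk"},
    "leap":   {"walk", "stand"},
    "get_up": {"stand"},
}

def can_transition(current: str, requested: str) -> bool:
    """True if the character can blend from `current` into `requested`."""
    return requested in TRANSITIONS.get(current, set())

assert can_transition("walk", "leap")
assert not can_transition("get_up", "leap")  # must stand up first
```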

Everyone says they use "AI"; what techniques or algorithms are you using?

We employ a variety of machine learning and deep reinforcement learning techniques, in addition to robotics algorithms for physical character modeling. Our behavior training algorithm is proprietary and will remain so. However, those interested in the science behind our technology can review the work of our Chief Scientist, Libin Liu; see our blog post on his latest paper.

How efficient/performant are characters trained using DeepMotion?

Neuron character control files are fairly performant. We anticipate users being able to simulate upwards of 10 characters in real-time on PC, and upwards of 2 characters on mobile.

My characters have different body types; can I use the same Behavior on both?

Yes. You will still need to retarget the motions to both characters to ensure the motion is scaled to their bodies. Characters do need to have bodies that are physically possible for training to work as intended.

Can I create my own training data?

Yes! Upload your own .bvh file to train your character on custom data.
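
BVH is a plain-text motion capture format, so it is easy to sanity-check a file before uploading it. The sketch below reads the frame count and frame time from a standard .bvh MOTION header; it is a generic check, not part of any Neuron tooling.

```python
# Generic helper for inspecting a standard BVH file before upload.
def bvh_summary(path: str):
    """Return (frame_count, frame_time_seconds) from a BVH MOTION header."""
    frames, frame_time = 0, 0.0
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("Frames:"):
                frames = int(line.split(":")[1])
            elif line.startswith("Frame Time:"):
                frame_time = float(line.split(":")[1])
                break
    return frames, frame_time

frames, dt = bvh_summary("my_capture.bvh")
print(f"{frames} frames at {1.0 / dt:.0f} fps = {frames * dt:.1f} s of motion")
```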

What is Neuron compatible with?

Neuron cloud training is compatible with rigged biped character FBX/GLTF files containing 60+ bones. Users will also get access to our runtime SDKs for Unity and Unreal.
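
If you want to verify a glTF character's bone count before uploading, a quick pre-flight check with the third-party pygltflib package (our assumption for this sketch, not a Neuron requirement) might look like this:

```python
# Counts the joints referenced by the first skin in a glTF file.
# Install the third-party dependency with: pip install pygltflib
from pygltflib import GLTF2

def count_bones(gltf_path: str) -> int:
    gltf = GLTF2().load(gltf_path)
    if not gltf.skins:
        return 0  # no skin means no skeleton to retarget
    return len(gltf.skins[0].joints)

bones = count_bones("character.gltf")
# The 60-bone figure comes from the compatibility note above.
print(f"{bones} bones ->", "looks compatible" if bones >= 60 else "too few bones")
```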

What is your licensing model?

Neuron users will pay a one-time fee per Behavior license. These Behaviors can be used on unlimited characters and across multiple projects, with a nominal cloud training fee charged for retargeting or fine-tuning a Neuron Behavior. However, each account will be restricted to one user. Our runtime package will be free for Neuron Indie users. Neuron Enterprise includes support services and custom training; please contact [email protected] to discuss Enterprise licensing.

What is the timeline for Neuron's BAAS release? When will I have access?

The Neuron BAAS cloud release is slated for Q4 of 2018. Backers of this campaign will be invited to join prior to the full launch and will additionally receive an exclusive Unity demo package for early testing. Perks like free retargeting will be redeemable once early adopters gain access to the BAAS platform.

