iPi Soft

Motion Capture for the Masses


Archive for April, 2017

iPi Mocap Aids Researchers At Rush Alzheimer’s Disease Center

Posted on: April 25th, 2017

iPi Soft Markerless Motion Capture Technology Aids Researchers At Rush Alzheimer’s Disease Center

Innovative Motion Capture Solution Helps Quantify And Improve Alzheimer’s Biomechanical Data Collection Strategies


MOSCOW – April 27, 2017 – Researchers at the Rush Alzheimer’s Disease Center (RADC), a department at Rush University Medical Center in Chicago currently focused on several large ongoing studies of aging and dementia involving more than 2,000 individuals, have found a new tool in their battle against Alzheimer’s and other forms of dementia: markerless motion capture.

According to Bob Dawe, PhD, Assistant Professor of Radiology, there is increasing recognition of the role gait, mobility and physical activity play in brain health, successful aging and quality of life in older adults. Because many of the seniors in the study are not able or willing to travel to the facility, a team of RADC researchers is using iPi Soft motion capture technology in conjunction with Microsoft Kinect motion sensor devices to go into different communities to meet with participants and conduct annual tests of cognitive ability, health status questionnaires, blood draws and other research with the goal, he says, of better understanding the factors most influential in staving off these diseases.

“Using iPi Motion Capture software paired with Microsoft Kinect we now have a very accurate digital record of participants’ motions as they complete different motor performances, from which we can better quantify myriad facets of gait, balance and mobility,” Dawe says. “We compared some of the metrics extracted with iPi Motion Capture to the gold standard optical tracking system in the motion analysis laboratory and the timing measures are spot on, and the accuracy of estimated joint angles is also very respectable. On top of that, the iPi Soft motion capture system is more user-friendly than the professional motion tracking rig.”
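Joint angles of the kind Dawe mentions can be estimated directly from tracked 3D joint positions. As a purely illustrative sketch (this is not the RADC’s actual analysis pipeline, and `joint_angle` is a hypothetical helper), the angle at a joint such as the knee can be computed from three tracked points with basic vector math:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by 3D points a-b-c,
    e.g. hip-knee-ankle for a simple knee flexion estimate."""
    ba = [a[i] - b[i] for i in range(3)]   # vector from joint to first point
    bc = [c[i] - b[i] for i in range(3)]   # vector from joint to second point
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

# A fully extended leg: hip, knee and ankle are collinear,
# so the knee angle is 180 degrees.
hip, knee, ankle = (0, 1.0, 0), (0, 0.5, 0), (0, 0.0, 0)
print(round(joint_angle(hip, knee, ankle)))  # 180
```

Applied per frame to an exported motion capture track, the same calculation yields a joint-angle time series from which gait metrics can be derived.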

Dawe explained that the RADC first became interested in using the Kinect as a tool to capture aspects of mobility in its older research participants, but soon learned that the device alone would not yield sufficiently accurate results.

“iPi Soft’s iPi Recorder and iPi Mocap Studio are very polished products that did not require extensive programming knowledge,” Dawe notes. “I was thrilled when it worked right out of the box in just a few moments, with the installation package taking care of all the necessary components without any of the cryptic error messages I’d become accustomed to with some of the open source packages I’d worked with in the past.”

The RADC testing measures about a dozen performances, such as walking a straight line, rising from a chair, standing with eyes closed for 20 seconds, standing on one foot, etc. Dawe explained that iPi Motion Capture integrated perfectly with their current gait and mobility-testing collection strategies in that it imposed no additional testing burden on participants, such as the need to shed clothing or wear sensor suits.

Additionally, the systems worked intuitively, so the RADC researchers were not preoccupied with the computer during the testing session and could maintain their focus on the study participants. Using iPi Motion Capture with Kinect sensors also yielded better spatial and temporal resolution and a wider field of view; combined with its portability and easy setup, this made it ideal for conducting research, especially in the field in participants’ homes with limited open space.

“Adding iPi Soft and Kinect was just a matter of training our researchers and devising a consistent placement strategy for the hardware and figuring out how to handle all the data,” Dawe notes.

“We are particularly excited to know that the renowned Rush Alzheimer’s Disease Center is relying on our markerless motion capture solution to advance meaningful research to help fight those afflicted with Alzheimer’s,” Michael Nikonov, iPi Soft founder and chief technology architect says. “As more medical and research professionals continue to become familiar with the benefits of markerless tracking and visualizing as a professional and reliable alternative for capturing human motion data without the need for expensive facility space, clumsy sensor suits with reflective markers or a team of technicians, we see endless possibilities for its use in solving important medical problems.”

About iPi Soft:

Launched in 2008 by CEO and Chief Technology Architect Michael Nikonov, iPi Soft, LLC is the Moscow-based developer of iPi Motion Capture™, a markerless motion capture software tool that uses sophisticated image processing and computer vision algorithms to recognize and track the human body. The company’s breakthrough technology digitizes the movement of a human skeleton, rendering it in expressive 3D characters for video games or computer generated films. For additional information on iPi Soft, product pricing or a 30-day free trial, please visit http://www.ipisoft.com.

 All trademarks contained herein are the property of their respective owners.

Media Contacts: 

Ambient PR, Vicky Gray-Clark, vicky@ambientpr.com, 408-318-1980

Right Word Media, Ray Ecke, ray@rightwordmedia.com, 973-726-3797


Faster Tracking / Export Enhancements Release

Posted on: April 20th, 2017

New software release includes:

N E W    F E A T U R E S:

  • New fast tracking algorithm for depth sensors (BETA version). To use it, turn on the Use fast tracking algorithm (BETA) checkbox on the Tracking tab.
    • Tracking speed increases by up to 2.5 times, depending on the particular hardware and tracking options.
    • Available for single-actor projects and single-GPU configurations.
Tip: If you use 2 GPUs for tracking, first uncheck the second GPU via the Help > Manage GPU Usage menu.
  • Improvements in animation export.
    • New control on Export tab to rotate imported characters into proper orientation. This is useful for many popular characters, including the standard Unreal Engine character.
    • Motion is now correctly transferred to imported characters that have a separate Root bone and Hips/Pelvis bone. You can map hips motion either to Root or to Hips/Pelvis. This is useful for game engine characters, including standard Unity 3D Engine and Unreal Engine characters.
    • Support for FBX format 7.5 (2016).
    • Added finger bones to motion transfer profile for Valve characters in SMD and DMX formats.
Note: The built-in rig for Blender has been removed, because modern versions of Blender no longer require a change of axes orientation.
  • Added the ability to select flashlight color for marker detection during calibration (useful in case of a light background).
  • Added hints about GPUs to progress bar area during tracking.
  • Added a dialog with common actions available for the exported file after animation export and video export operations.

B U G    F I X E S:

  • Fixed a crash when resizing the window during video export.
  • Fixed crashes on certain errors when opening target characters and animations. Errors are now reported.
  • Fixed a rare hang in the Automation Add-on during event sending.
  • Fixed GPU memory leaks on project open / close operations and during calibration.

U N D E R    T H E    H O O D:

  • Improved automatic floor detection for both RGB cameras and depth sensors.
  • Replaced per-frame viewport repaint during tracking / refine / jitter removal with timer-based repaint. May yield a slight speed increase on powerful GPUs.
  • Improved tracking / refine / jitter removal speed calculation and changed the formatting of speed values.
  • Using FBX SDK 2017.1 for handling FBX files.
  • Fixed a speed degradation issue during board-based calibration in depth sensor projects.
  • For multi-GPU configurations, the old tracking algorithm is now used when the number of cameras exceeds 6. In most cases this is faster than using the default tracking on a single GPU.