Bio


C. Karen Liu is a professor in the Computer Science Department at Stanford University. Prior to joining Stanford, Liu was a faculty member in the School of Interactive Computing at Georgia Tech. She received her Ph.D. in Computer Science from the University of Washington. Liu's research interests are in computer graphics and robotics, including physics-based animation, character animation, optimal control, reinforcement learning, and computational biomechanics. She has developed computational approaches to modeling realistic and natural human movements, learning complex control policies for humanoids and assistive robots, and advancing fundamental numerical simulation and optimal control algorithms. The algorithms and software developed in her lab have fostered interdisciplinary collaborations with researchers in robotics, computer graphics, mechanical engineering, biomechanics, neuroscience, and biology. Liu received a National Science Foundation CAREER Award and an Alfred P. Sloan Fellowship, and was named one of MIT Technology Review's Young Innovators Under 35. In 2012, Liu received the ACM SIGGRAPH Significant New Researcher Award for her contributions to the field of computer graphics.

Honors & Awards


  • University of Washington Allen School Alumni Impact Award, University of Washington (2024)
  • ACM SIGGRAPH Academy, ACM (2021)
  • SIGGRAPH Significant New Researcher Award, ACM (2012)
  • Alfred P. Sloan Research Fellowship, Alfred P. Sloan Foundation (2010)
  • Young Innovators Under 35, MIT Technology Review (2007)
  • CAREER Award, National Science Foundation (2007)

Professional Education


  • BS, National Taiwan University, Computer Science (1999)
  • MS, University of Washington, Computer Science (2001)
  • PhD, University of Washington, Computer Science (2005)

All Publications


  • GaitDynamics: a generative foundation model for analyzing human walking and running. Nature biomedical engineering Tan, T., Van Wouwe, T., Werling, K. F., Liu, C. K., Delp, S. L., Hicks, J. L., Chaudhari, A. S. 2026

    Abstract

    Understanding the dynamics of human gait, including both motions and forces, is vital to promote human mobility. While deep learning models may have advantages over costly laboratory-based experiments and physics-based simulations, existing models have been trained on small datasets with homogeneous demographics and focus on predicting a single output. We developed GaitDynamics, a generative foundation model trained on a large dataset of diverse gait patterns, which allows for flexible inputs, outputs and clinical applications. We illustrate the use of GaitDynamics for: (1) estimating ground reaction forces from kinematics with high accuracy even with missing kinematic data, (2) predicting the effects of gait modifications on knee loading without resource-intensive experiments and (3) predicting kinematic and force changes that occur with increasing running speeds. Our results demonstrate the accuracy and efficiency of GaitDynamics, showing its potential to assess and optimize gait for injury prevention, disease treatment and performance coaching. All data, code and trained models are publicly shared.

    View details for DOI 10.1038/s41551-025-01565-8

    View details for PubMedID 41491893

  • Learning to Ball: Composing Policies for Long-Horizon Basketball Moves ACM TRANSACTIONS ON GRAPHICS Xu, P., Wu, Z., Wang, R., Sarukkai, V., Fatahalian, K., Karamouzas, I., Zordan, V., Liu, C. 2025; 44 (6)

    View details for DOI 10.1145/3763367

    View details for Web of Science ID 001650565800001

  • AddBiomechanics Dataset: Capturing the Physics of Human Motion at Scale. Computer vision - ECCV ... : ... European Conference on Computer Vision : proceedings. European Conference on Computer Vision Werling, K., Kaneda, J., Tan, T., Agarwal, R., Skov, S., Van Wouwe, T., Uhlrich, S., Bianco, N., Ong, C., Falisse, A., Sapkota, S., Chandra, A., Carter, J., Preatoni, E., Fregly, B., Hicks, J., Delp, S., Liu, C. K. 2025; 15146: 490-508

    Abstract

    While reconstructing human poses in 3D from inexpensive sensors has advanced significantly in recent years, quantifying the dynamics of human motion, including the muscle-generated joint torques and external forces, remains a challenge. Prior attempts to estimate physics from reconstructed human poses have been hampered by a lack of datasets with high-quality pose and force data for a variety of movements. We present the AddBiomechanics Dataset 1.0, which includes physically accurate human dynamics of 273 human subjects, over 70 hours of motion and force plate data, totaling more than 24 million frames. To construct this dataset, novel analytical methods were required, which are also reported here. We propose a benchmark for estimating human dynamics from motion using this dataset, and present several baseline results. The AddBiomechanics Dataset is publicly available at addbiomechanics.org/download_data.html.

    View details for DOI 10.1007/978-3-031-73223-2_27

    View details for PubMedID 40151203

    View details for PubMedCentralID PMC11948690

  • Flying Vines: Design, Modeling, and Control of a Soft Aerial Robotic Arm IEEE ROBOTICS AND AUTOMATION LETTERS Jitosho, R., Winston, C. E., Yang, S., Li, J., Ahlquist, M., Woehrle, N., Liu, C., Okamura, A. M. 2025; 10 (10): 10514-10521
  • Detecting artificially impaired balance in human locomotion: metrics, perturbation effects and detection thresholds. The Journal of experimental biology Wu, J., Raitor, M., Tan, G. R., Staudenmayer, K. L., Delp, S. L., Liu, C. K., Collins, S. H. 2025; 228 (10)

    Abstract

    Measuring balance is important for detecting impairments and developing interventions to prevent falls, but there is no consensus on which method is most effective. Many balance metrics derived from steady-state walking data have been proposed, such as step-width variability, step-time variability, foot placement predictability, maximum Lyapunov exponent and margin of stability. Recently, perturbation-based metrics such as center of mass displacement have also been explored. Perturbations typically involve unexpected disturbances applied to the subject. In this study we collected walking data from 10 healthy human subjects while walking normally and while impairing balance with ankle braces, eye-blocking masks and pneumatic jets on their legs. In some walking trials we also applied mechanical perturbations to the pelvis. We obtained a comprehensive biomechanics dataset and compared the ability of various metrics to detect impaired balance using steady-state walking and perturbation recovery data. We also compared metric performance using thresholds informed by data from multiple subjects versus subject-specific thresholds. We found that step-width variability, step-time variability and foot placement predictability, using steady-state data and subject-specific thresholds, detected impaired balance with the highest accuracy (≥86%), whereas other metrics were less effective (≤68%). Incorporating perturbation data did not improve accuracy of these metrics, although this comparison was limited by the small amount of perturbation data included and analyzed. Subject-specific baseline measurements improved the detection of changes in balance ability. Thus, in clinical practice, taking baseline measurements might improve the detection of impairment due to aging or disease progression.

    View details for DOI 10.1242/jeb.249339

    View details for PubMedID 40403405

  • Generative Motion Infilling from Imprecisely Timed Keyframes COMPUTER GRAPHICS FORUM Goel, P., Zhang, H., Liu, C. K., Fatahalian, K. 2025

    View details for DOI 10.1111/cgf.70060

    View details for Web of Science ID 001469209500001

  • GaitDynamics: A Generative Foundation Model for Analyzing Human Walking and Running. Research square Tan, T., Van Wouwe, T., Werling, K. F., Liu, C. K., Delp, S. L., Hicks, J. L., Chaudhari, A. S. 2025

    Abstract

    Understanding the dynamics of human gait, including both motions and forces, is vital to promote human health and performance. Conventional gait analysis requires laboratory-based experiments and physics-based simulations to quantify gait dynamics and analyze how dynamics change with treatment, training, injury, and disease. However, the high costs associated with experiments and simulations have confined the use of gait dynamics to small-scale research studies. While deep learning models offer low-cost prediction and can be highly expressive in fitting large-scale data, existing models have primarily been trained on small datasets with homogeneous demographics and focused on predicting a single output. To overcome these limitations, we developed GaitDynamics, a generative foundation model for human gait that is trained on a large dataset with diverse participant demographics and gait patterns. GaitDynamics can be used for diverse tasks with different inputs, outputs, and clinical applications, which we illustrate in three examples: i) estimating ground reaction forces from kinematics with high accuracy and robustness even with missing kinematic data and for populations not included in the training dataset, ii) predicting the influence of gait modifications on knee loading without the need for resource-intensive experiments, and iii) predicting kinematic and force changes that occur with increasing running speeds. These representative tasks demonstrate that GaitDynamics makes accurate and rapid predictions in seconds based on flexible inputs, showing its potential to assess and optimize gait for injury prevention, disease treatment, and performance coaching. All data, code, and trained models are publicly shared.

    View details for DOI 10.21203/rs.3.rs-6206222/v1

    View details for PubMedID 40166023

    View details for PubMedCentralID PMC11957236

  • Nymeria: A Massive Collection of Multimodal Egocentric Daily Motion in the Wild Ma, L., Ye, Y., Hong, F., Guzov, V., Jiang, Y., Postyeni, R., Pesqueira, L., Gamino, A., Baiyya, V., Kim, H., Bailey, K., Fosas, D. S., Liu, C., Liu, Z., Engel, J., De Nardi, R., Newcombe, R. edited by Leonardis, A., Ricci, E., Roth, S., Russakovsky, O., Sattler, T., Varol, G. SPRINGER INTERNATIONAL PUBLISHING AG. 2025: 445-465
  • Chain-of-Modality: Learning Manipulation Programs from Multimodal Human Videos with Vision-Language-Models Wang, C., Xia, F., Yu, W., Zhang, T., Zhang, R., Liu, C., Li Fei-Fei, Tan, J., Liang, J. edited by Ott, C. IEEE. 2025: 6527-6535
  • ARCap: Collecting High-quality Human Demonstrations for Robot Learning with Augmented Reality Feedback Chen, S., Wang, C., Nguyen, K., Li Fei-Fei, Liu, C. edited by Ott, C. IEEE. 2025: 8291-8298
  • LookOut: Real-World Humanoid Egocentric Navigation Pan, B., Harley, A., Liu, C., Guibas, L. 2025
  • Human-Object Interaction from Human-Level Instructions Wu, Z., Li, J., Xu, P., Liu, C. 2025
  • PGC: Physics-Based Gaussian Cloth from a Single Pose Guo, M., Chiang, M., Santesteban, I., Sarafianos, N., Chen, H., Halimi, O., Bozic, A., Saito, S., Wu, J., Liu, C., Stuyck, T., Larionov, E. IEEE COMPUTER SOC. 2025: 21215-21225
  • Generating Detailed Character Motion from Blocking Poses Goel, P., Tevet, G., Liu, C., Fatahalian, K. 2025
  • Robot Trains Robot: Automatic Real-World Policy Adaptation and Learning for Humanoids Hu, K., Shi, H., He, Y., Wang, W., Liu, C. 2025
  • Crossing the Human-Robot Embodiment Gap with Sim-to-Real RL using One Human Demonstration Lum, T., Lee, O., Liu, C., Bohg, J. 2025
  • ToddlerBot: Open-Source ML-Compatible Humanoid Platform for Loco-Manipulation Shi, H., Wang, W., Song, S., Liu, C. 2025
  • HEAD: Hand-Eye Autonomous Delivery: Learning Humanoid Navigation, Locomotion and Reaching Chen, S., Ye, Y., Cao, Z., Lew, J., Xu, P., Liu, C. 2025
  • TWIST: Teleoperated Whole-Body Imitation System Ze, Y., Chen, Z., Araujo, J., Cao, Z., Peng, X., Wu, J., Liu, C. 2025
  • Lifting Motion to the 3D World via 2D Diffusion Li, J., Liu, C., Wu, J. IEEE COMPUTER SOC. 2025: 17518-17528
  • AddBiomechanics Dataset: Capturing the Physics of Human Motion at Scale Werling, K., Kaneda, J., Tan, T., Agarwal, R., Skov, S., Van Wouwe, T., Uhlrich, S., Bianco, N., Ong, C., Falisse, A., Sapkota, S., Chandra, A., Carter, J., Preatoni, E., Fregly, B., Hicks, J., Delp, S., Liu, C. edited by Leonardis, A., Ricci, E., Roth, S., Russakovsky, O., Sattler, T., Varol, G. SPRINGER INTERNATIONAL PUBLISHING AG. 2025: 490-508
  • Creating a 3D Mesh in A-pose from a Single Image for Character Rigging COMPUTER GRAPHICS FORUM Lee, S., Liu, C. 2024

    View details for DOI 10.1111/cgf.15177

    View details for Web of Science ID 001331442700001

  • State of the Art on Diffusion Models for Visual Computing COMPUTER GRAPHICS FORUM Po, R., Yifan, W., Golyanik, V., Aberman, K., Barron, J. T., Bermano, A., Chan, E., Dekel, T., Holynski, A., Kanazawa, A., Liu, C. K., Liu, L., Mildenhall, B., Niessner, M., Ommer, B., Theobalt, C., Wonka, P., Wetzstein, G. 2024

    View details for DOI 10.1111/cgf.15063

    View details for Web of Science ID 001215986500001

  • A simulation framework to determine optimal strength training and musculoskeletal geometry for sprinting and distance running. PLoS computational biology Van Wouwe, T., Hicks, J., Delp, S., Liu, K. C. 2024; 20 (2): e1011410

    Abstract

    Musculoskeletal geometry and muscle volumes vary widely in the population and are intricately linked to the performance of tasks ranging from walking and running to jumping and sprinting. As an alternative to experimental approaches, where it is difficult to isolate factors and establish causal relationships, simulations can be used to independently vary musculoskeletal geometry and muscle volumes, and develop a fundamental understanding. However, our ability to understand how these parameters affect task performance has been limited due to the high computational cost of modelling the necessary complexity of the musculoskeletal system and solving the requisite multi-dimensional optimization problem. For example, sprinting and running are fundamental to many forms of sport, but past research on the relationships between musculoskeletal geometry, muscle volumes, and running performance has been limited to observational studies, which have not established cause-effect relationships, and simulation studies with simplified representations of musculoskeletal geometry. In this study, we developed a novel musculoskeletal simulator that is differentiable with respect to musculoskeletal geometry and muscle volumes. This simulator enabled us to find the optimal body segment dimensions and optimal distribution of added muscle volume for sprinting and marathon running. Our simulation results replicate experimental observations, such as increased muscle mass in sprinters, as well as a mass in the lower end of the healthy BMI range and a higher leg-length-to-height ratio in marathon runners. The simulations also reveal new relationships, for example showing that hip musculature is vital to both sprinting and marathon running. We found hip flexor and extensor moment arms were maximized to optimize sprint and marathon running performance, and hip muscles were the main target when we simulated strength training for sprinters. Our simulation results provide insight to inspire future studies to examine optimal strength training. Our simulator can be extended to other athletic tasks, such as jumping, or to non-athletic applications, such as designing interventions to improve mobility in older adults or individuals with movement disorders.

    View details for DOI 10.1371/journal.pcbi.1011410

    View details for PubMedID 38394308

  • Lower-Limb Exoskeletons Appeal to Both Clinicians and Older Adults, Especially for Fall Prevention and Joint Pain Reduction. IEEE transactions on neural systems and rehabilitation engineering : a publication of the IEEE Engineering in Medicine and Biology Society Raitor, M., Ruggles, S. W., Delp, S. L., Liu, C. K., Collins, S. H. 2024; 32: 1577-1585

    Abstract

    Exoskeletons are a burgeoning technology with many possible applications to improve human life; focusing the effort of exoskeleton research and development on the most important features is essential for facilitating adoption and maximizing positive societal impact. To identify important focus areas for exoskeleton research and development, we conducted a survey with 154 potential users (older adults) and another survey with 152 clinicians. The surveys were conducted online, and to ensure a consistent concept of an exoskeleton across respondents, an image of a hip exoskeleton was shown during exoskeleton-related prompts. The survey responses indicate that both older adults and clinicians are open to using exoskeletons, that fall prevention and joint pain reduction are especially important features, and that users are likely to wear an exoskeleton in the scenarios where it has the greatest opportunity to help prevent a fall. These findings can help inform future exoskeleton research and guide the development of devices that are accepted, used, and provide meaningful benefit to users.

    View details for DOI 10.1109/TNSRE.2024.3381979

    View details for PubMedID 38536680

  • PDP: Physics-Based Character Animation via Diffusion Policy Truong, T., Piseno, M., Xie, Z., Liu, K. edited by Spencer, S. N. ASSOC COMPUTING MACHINERY. 2024
  • Object-Centric Dexterous Manipulation from Human Motion Data Chen, Y., Wang, C., Yang, Y., Liu, K. edited by Kroemer, O., Agrawal, P., Burgard, W. JMLR-JOURNAL MACHINE LEARNING RESEARCH. 2024
  • Nymeria: A massive collection of multimodal egocentric daily motion in the wild Ma, L., Liu, C., Newcombe, R. 2024
  • AddBiomechanics Dataset: Capturing the Physics of Human Motion at Scale Werling, K., Kaneda, J., Liu, C. 2024
  • DiffusionPoser: Real-time Human Motion Reconstruction From Arbitrary Sparse Sensors Using Autoregressive Diffusion Van Wouwe, T., Lee, S., Falisse, A., Delp, S., Liu, C. IEEE COMPUTER SOC. 2024: 2513-2523
  • Dexcap: Scalable and portable mocap data collection system for dexterous manipulation Wang, C., Shi, H., Wang, W., Zhang, R., Fei-Fei, L., Liu, C. 2024
  • Behavior-1k: A human-centered, embodied ai benchmark with 1,000 everyday activities and realistic simulation Li, C., Liu, C., Fei-Fei, L. 2024
  • SpringGrasp: Synthesizing Compliant, Dexterous Grasps under Shape Uncertainty Chen, S., Bohg, J., Liu, C. 2024
  • Iterative Motion Editing with Natural Language Goel, P., Wang, K., Liu, C., Fatahalian, K., Spencer, S. ASSOC COMPUTING MACHINERY. 2024
  • Controllable human-object interaction synthesis Li, J., Clegg, A., Mottaghi, ., Wu, J., Puig, X., Liu, C. 2024
  • One-shot transfer of long-horizon extrinsic manipulation through contact retargeting Wu, A., Wang, R., Chen, S., Eppner, C., Liu, C. 2024
  • Object-centric dexterous manipulation from human motion data Chen, Y., Wang, C., Yang, Y., Liu, C. 2024
  • Motion Diffusion-Guided 3D Global HMR from a Dynamic Camera CVPR Heo, J., Wang, ., Liu, C., Yeung-Levy, S. 2024
  • PDP: Physics-based character animation via diffusion policy ACM SIGGRAPH Asia Truong, T., Piseno, M., Xie, Z., Liu, C. 2024
  • FürElise: Capturing and Physically Synthesizing Hand Motion of Piano Performance SIGGRAPH Asia Wang, R., Xu, P., Shi, H., Schumann, E., Liu, C. 2024

    View details for DOI 10.1145/3680528.3687703

  • Object Motion Guided Human Motion Synthesis ACM TRANSACTIONS ON GRAPHICS Li, J., Wu, J., Liu, C. 2023; 42 (6)

    View details for DOI 10.1145/3618333

    View details for Web of Science ID 001139790400025

  • From Skin to Skeleton: Towards Biomechanically Accurate 3D Digital Humans ACM TRANSACTIONS ON GRAPHICS Keller, M., Werling, K., Shin, S., Delp, S., Pujades, S., Liu, C., Black, M. J. 2023; 42 (6)

    View details for DOI 10.1145/3618381

    View details for Web of Science ID 001139790400081

  • AddBiomechanics: Automating model scaling, inverse kinematics, and inverse dynamics from human motion data through sequential optimization. PloS one Werling, K., Bianco, N. A., Raitor, M., Stingel, J., Hicks, J. L., Collins, S. H., Delp, S. L., Liu, C. K. 2023; 18 (11): e0295152

    Abstract

    Creating large-scale public datasets of human motion biomechanics could unlock data-driven breakthroughs in our understanding of human motion, neuromuscular diseases, and assistive devices. However, the manual effort currently required to process motion capture data and quantify the kinematics and dynamics of movement is costly and limits the collection and sharing of large-scale biomechanical datasets. We present a method, called AddBiomechanics, to automate and standardize the quantification of human movement dynamics from motion capture data. We use linear methods followed by a non-convex bilevel optimization to scale the body segments of a musculoskeletal model, register the locations of optical markers placed on an experimental subject to the markers on a musculoskeletal model, and compute body segment kinematics given trajectories of experimental markers during a motion. We then apply a linear method followed by another non-convex optimization to find body segment masses and fine tune kinematics to minimize residual forces given corresponding trajectories of ground reaction forces. The optimization approach requires approximately 3-5 minutes to determine a subject's skeleton dimensions and motion kinematics, and less than 30 minutes of computation to also determine dynamically consistent skeleton inertia properties and fine-tuned kinematics and kinetics, compared with about one day of manual work for a human expert. We used AddBiomechanics to automatically reconstruct joint angle and torque trajectories from previously published multi-activity datasets, achieving close correspondence to expert-calculated values, marker root-mean-square errors less than 2 cm, and residual force magnitudes smaller than 2% of peak external force. Finally, we confirmed that AddBiomechanics accurately reproduced joint kinematics and kinetics from synthetic walking data with low marker error and residual loads. We have published the algorithm as an open source cloud service at AddBiomechanics.org, which is available at no cost and asks that users agree to share processed and de-identified data with the community. As of this writing, hundreds of researchers have used the prototype tool to process and share about ten thousand motion files from about one thousand experimental subjects. Reducing the barriers to processing and sharing high-quality human motion biomechanics data will enable more people to use state-of-the-art biomechanical analysis, do so at lower cost, and share larger and more accurate datasets.

    View details for DOI 10.1371/journal.pone.0295152

    View details for PubMedID 38033114

  • AddBiomechanics: Automating model scaling, inverse kinematics, and inverse dynamics from human motion data through sequential optimization. bioRxiv : the preprint server for biology Werling, K., Bianco, N. A., Raitor, M., Stingel, J., Hicks, J. L., Collins, S. H., Delp, S. L., Liu, C. K. 2023

    Abstract

    Creating large-scale public datasets of human motion biomechanics could unlock data-driven breakthroughs in our understanding of human motion, neuromuscular diseases, and assistive devices. However, the manual effort currently required to process motion capture data and quantify the kinematics and dynamics of movement is costly and limits the collection and sharing of large-scale biomechanical datasets. We present a method, called AddBiomechanics, to automate and standardize the quantification of human movement dynamics from motion capture data. We use linear methods followed by a non-convex bilevel optimization to scale the body segments of a musculoskeletal model, register the locations of optical markers placed on an experimental subject to the markers on a musculoskeletal model, and compute body segment kinematics given trajectories of experimental markers during a motion. We then apply a linear method followed by another non-convex optimization to find body segment masses and fine tune kinematics to minimize residual forces given corresponding trajectories of ground reaction forces. The optimization approach requires approximately 3-5 minutes to determine a subject's skeleton dimensions and motion kinematics, and less than 30 minutes of computation to also determine dynamically consistent skeleton inertia properties and fine-tuned kinematics and kinetics, compared with about one day of manual work for a human expert. We used AddBiomechanics to automatically reconstruct joint angle and torque trajectories from previously published multi-activity datasets, achieving close correspondence to expert-calculated values, marker root-mean-square errors less than 2 cm, and residual force magnitudes smaller than 2% of peak external force. Finally, we confirmed that AddBiomechanics accurately reproduced joint kinematics and kinetics from synthetic walking data with low marker error and residual loads. We have published the algorithm as an open source cloud service at AddBiomechanics.org, which is available at no cost and asks that users agree to share processed and de-identified data with the community. As of this writing, hundreds of researchers have used the prototype tool to process and share about ten thousand motion files from about one thousand experimental subjects. Reducing the barriers to processing and sharing high-quality human motion biomechanics data will enable more people to use state-of-the-art biomechanical analysis, do so at lower cost, and share larger and more accurate datasets.

    View details for DOI 10.1101/2023.06.15.545116

    View details for PubMedID 37398034

    View details for PubMedCentralID PMC10312696

  • Simulating the effect of ankle plantarflexion and inversion-eversion exoskeleton torques on center of mass kinematics during walking. PLoS computational biology Bianco, N. A., Collins, S. H., Liu, K., Delp, S. L. 2023; 19 (8): e1010712

    Abstract

    Walking balance is central to independent mobility, and falls due to loss of balance are a leading cause of death for people 65 years of age and older. Bipedal gait is typically unstable, but healthy humans use corrective torques to counteract perturbations and stabilize gait. Exoskeleton assistance could benefit people with neuromuscular deficits by providing stabilizing torques at lower-limb joints to replace lost muscle strength and sensorimotor control. However, it is unclear how applied exoskeleton torques translate to changes in walking kinematics. This study used musculoskeletal simulation to investigate how exoskeleton torques applied to the ankle and subtalar joints alter center of mass kinematics during walking. We first created muscle-driven walking simulations using OpenSim Moco by tracking experimental kinematics and ground reaction forces recorded from five healthy adults. We then used forward integration to simulate the effect of exoskeleton torques applied to the ankle and subtalar joints while keeping muscle excitations fixed based on our previous tracking simulation results. Exoskeleton torque lasted for 15% of the gait cycle and was applied between foot-flat and toe-off during the stance phase, and changes in center of mass kinematics were recorded when the torque application ended. We found that changes in center of mass kinematics were dependent on both the type and timing of exoskeleton torques. Plantarflexion torques produced upward and backward changes in velocity of the center of mass in mid-stance and upward and smaller forward velocity changes near toe-off. Eversion and inversion torques primarily produced lateral and medial changes in velocity in mid-stance, respectively. Intrinsic muscle properties reduced kinematic changes from exoskeleton torques. Our results provide mappings between ankle plantarflexion and inversion-eversion torques and changes in center of mass kinematics which can inform designers building exoskeletons aimed at stabilizing balance during walking. Our simulations and software are freely available and allow researchers to explore the effects of applied torques on balance and gait.

    View details for DOI 10.1371/journal.pcbi.1010712

    View details for PubMedID 37549183

  • Hierarchical Planning and Control for Box Loco-Manipulation PROCEEDINGS OF THE ACM ON COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES Xie, Z., Tseng, J., Starke, S., De Panne, M., Liu, C. 2023; 6 (3)

    View details for DOI 10.1145/3606931

    View details for Web of Science ID 001059100600013

  • Anatomically Detailed Simulation of Human Torso ACM TRANSACTIONS ON GRAPHICS Lee, S., Jiang, Y., Liu, C. 2023; 42 (4)

    View details for DOI 10.1145/3592425

    View details for Web of Science ID 001044671300006

  • Trajectory and Sway Prediction Towards Fall Prevention. IEEE International Conference on Robotics and Automation : ICRA : [proceedings]. IEEE International Conference on Robotics and Automation Wang, W., Raitor, M., Collins, S., Liu, C. K., Kennedy, M. 2023; 2023: 10483-10489

    Abstract

    Falls are the leading cause of fatal and non-fatal injuries, particularly for older persons. Imbalance can result from the body's internal causes (illness) or external causes (active or passive perturbation). Active perturbation results from applying an external force to a person, while passive perturbation results from human motion interacting with a static obstacle. This work proposes a metric that allows for monitoring of a person's torso and its correlation to active and passive perturbations. We show that large changes in torso sway can be strongly correlated with active perturbations. We also show that we can reasonably predict the future path and expected change in torso sway by conditioning the expected path and torso sway on the past trajectory, torso motion, and the surrounding scene. This could have direct future applications to fall prevention. Results demonstrate that torso sway is strongly correlated with perturbations, and that our model is able to make use of the visual cues presented in the panorama and condition the prediction accordingly.

    View details for DOI 10.1109/icra48891.2023.10161361

    View details for PubMedID 38009123

    View details for PubMedCentralID PMC10671274

  • On Designing a Learning Robot: Improving Morphology for Enhanced Task Performance and Learning Sorokin, M., Fu, C., Tan, J., Liu, K. C., Bai, Y., Lu, W., Ha, S., Khansari, M. IEEE. 2023: 487-494
  • Benchmarking Rigid Body Contact Models Guo, M., Jiang, Y., Spielberg, A., Wu, J., Liu, K. edited by Pappas, G. J., Matni, N., Morari, M. JMLR-JOURNAL MACHINE LEARNING RESEARCH. 2023
  • Sequential Dexterity: Chaining Dexterous Policies for Long-Horizon Manipulation Chen, Y., Wang, C., Li Fei-Fei, Liu, K. edited by Tan, J., Toussaint, M., Darvish, K. JMLR-JOURNAL MACHINE LEARNING RESEARCH. 2023
  • Reinforcement Learning Enables Real-Time Planning and Control of Agile Maneuvers for Soft Robot Arms Conference on Robot Learning (CoRL) Jitosho, R., Lum, T., Okamura, A., Liu, C. 2023
  • DROP: Dynamics Responses from Human Motion Prior and Projective Dynamics ACM SIGGRAPH Jiang, Y., Won, J., Ye, Y., Liu, C. 2023
  • Synthesizing Dexterous Nonprehensile Pregrasp for Ungraspable Objects Chen, S., Wu, A., Liu, C. edited by Spencer, S. N. ASSOC COMPUTING MACHINERY. 2023
  • EDGE: Editable Dance Generation From Music Tseng, J., Castellon, R., Liu, C. IEEE COMPUTER SOC. 2023: 448-458
  • NeMo: 3D Neural Motion Fields from Multiple Video Instances of the Same Action Wang, K., Weng, Z., Xenochristou, M., Araujo, J., Gu, J., Liu, C., Yeung, S. IEEE COMPUTER SOC. 2023: 22129-22138
  • Ego-Body Pose Estimation via Ego-Head Pose Estimation Li, J., Liu, C., Wu, J. IEEE COMPUTER SOC. 2023: 17142-17151
  • CIRCLE: Capture In Rich Contextual Environments Araujo, J., Li, J., Vetrivel, K., Agarwal, R., Wu, J., Gopinath, D., Clegg, A., Liu, C. IEEE COMPUTER SOC. 2023: 21211-21221
  • Sequential Dexterity: Chaining Dexterous Policies for Long-Horizon Manipulation Conference on Robot Learning (CoRL) Chen, Y., Wang, C., Li, F., Liu, C. 2023
  • Characterizing Multidimensional Capacitive Servoing for Physical Human-Robot Interaction IEEE TRANSACTIONS ON ROBOTICS Erickson, Z., Clever, H. M., Gangaram, V., Xing, E., Turk, G., Liu, C., Kemp, C. C. 2022
  • A Survey on Reinforcement Learning Methods in Character Animation Kwiatkowski, A., Alvarado, E., Kalogeiton, V., Liu, C., Pettre, J., van de Panne, M., Cani, M. WILEY. 2022: 613-639

    View details for DOI 10.1111/cgf.14504

    View details for Web of Science ID 000802723900045

  • Learning to Navigate Sidewalks in Outdoor Environments IEEE ROBOTICS AND AUTOMATION LETTERS Sorokin, M., Tan, J., Liu, C., Ha, S. 2022; 7 (2): 3906-3913
  • DCL: Differential Contrastive Learning for Geometry-Aware Depth Synthesis IEEE ROBOTICS AND AUTOMATION LETTERS Shen, Y., Yang, Y., Zheng, Y., Liu, C., Guibas, L. J. 2022; 7 (2): 4845-4852
  • Task-Specific Design Optimization and Fabrication for Inflated-Beam Soft Robots with Growable Discrete Joints IEEE International Conference on Robotics and Automation (ICRA) Exarchos, I., Wang, K., Do, B., Stroppa, F., Coad, M., Okamura, A., Liu, C. 2022
  • GIMO: Gaze-Informed Human Motion Prediction in Context Zheng, Y., Yang, Y., Mo, K., Li, J., Yu, T., Liu, Y., Liu, C., Guibas, L. J. edited by Avidan, S., Brostow, G., Cisse, M., Farinella, G. M., Hassner, T. SPRINGER INTERNATIONAL PUBLISHING AG. 2022: 676-694
  • ADeLA: Automatic Dense Labeling with Attention for Viewpoint Adaptation in Semantic Segmentation Conference on Computer Vision and Pattern Recognition (CVPR) Yang, Y., Ren, H., Wang, H., Shen, B., Fan, Q., Zheng, Y., Liu, C., Guibas, L. 2022
  • Data-Augmented Contact Model for Rigid Body Simulation Learning for Dynamics & Control Conference (L4DC) Jian, Y., Sun, J., Liu, C. 2022
  • Learning Diverse and Physically Feasible Dexterous Grasps with Generative Model and Bilevel Optimization Conference on Robot Learning (CoRL) Wu, A., Guo, M., Liu, C. 2022
  • Transformer Inertial Poser: Real-time Human Motion Reconstruction from Sparse IMUs with Simultaneous Terrain Generation Proceedings of SIGGRAPH Asia Jiang, Y., Ye, Y., Gopinath, D., Won, J., Winkler, A., Liu, C. 2022
  • BEHAVIOR-1K: A Benchmark for Embodied AI with 1,000 Everyday Activities and Realistic Simulation Conference on Robot Learning (CoRL) Li, C. 2022
  • Real-time Model Predictive Control and System Identification Using Differentiable Physics Simulation IEEE Robotics and Automation Letters Chen, S., Werling, K., Wu, A., Liu, C. 2022
  • Scene Synthesis from Human Motion Proceedings of ACM SIGGRAPH Asia Ye, S., Wang, Y., Li, J., Park, D., Liu, C., Xu, H., Wu, J. 2022
  • Learning Human Search Behavior from Egocentric Visual Inputs COMPUTER GRAPHICS FORUM Sorokin, M., Yu, W., Ha, S., Liu, C. 2021; 40 (2): 389-398

    View details for DOI 10.1111/cgf.142641

    View details for Web of Science ID 000657959600032

  • The Role of Physics-Based Simulators in Robotics ANNUAL REVIEW OF CONTROL, ROBOTICS, AND AUTONOMOUS SYSTEMS, VOL 4, 2021 Liu, C., Negrut, D. edited by Leonard, N. E. 2021; 4: 35-58
  • Protective Policy Transfer Yu, W., Turk, G., Liu, C. K. 2021
  • SimGAN: Hybrid Simulator Identification for Domain Adaptation via Adversarial Reinforcement Learning Jiang, Y., Zhang, T., Ho, D., Bai, Y., Liu, C. K., Levine, S., Tan, J. 2021
  • Policy Transfer via Kinematic Domain Randomization and Adaptation Exarchos, I., Jiang, Y., Yu, W., Liu, C. K. 2021
  • Fast and Feature-Complete Differentiable Physics for Articulated Rigid Bodies with Contact Werling, K., Omens, D., Lee, J., Exarchos, I., Liu, C. K. 2021
  • Error-Aware Policy Learning: Zero-Shot Generalization in Partially Observable Dynamic Environments Kumar, V. C., Ha, S., Liu, C. K. 2021
  • Learning Task-Agnostic Action Spaces for Movement Optimization IEEE Transactions on Computer Graphics and Visualization Babadi, A., van de Panne, M., Liu, C. K., Hämäläinen, P. 2021
  • COCOI: Contact-aware Online Context Inference for Generalizable Non-planar Pushing Xu, Z., Yu, W., Herzog, A., Lu, W., Fu, C., Tomizuka, M., Bai, Y., Liu, C. K., Ho, D. 2021
  • iGibson 2.0: Object-Centric Simulation for Robot Learning of Everyday Household Tasks Li, C., Xia, F., Martin-Martin, R., Lingelbach, M., Srivastava, S., Shen, B., Vainio, K., Gokmen, C., Dharan, G., Jain, T., Kurenkov, A., Liu, C. K., Gweon, H., Wu, J., Fei-Fei, L., Savarese, S. 2021
  • Co-GAIL Learning Diverse Strategies for Human-Robot Collaboration Wang, C., Pérez-D'Arpino, C., Xu, D., Fei-Fei, L., Liu, C. K., Savarese, S. 2021
  • BEHAVIOR: Benchmark for Everyday Household Activities in Virtual, Interactive, and Ecological Environments Srivastava, S., Li, C., Lingelbach, M., Martin-Martin, R., Xia, F., Vainio, K., Lian, Z., Gokmen, C., Buch, S., Liu, C. K., Savarese, S., Gweon, H., Wu, J., Fei-Fei, L. 2021
  • DASH: Modularized Human Manipulation Simulation with Vision and Language for Embodied AI Jiang, Y., Guo, M., Li, J., Exarchos, I., Wu, J., Liu, C. K. 2021
  • Learning to Manipulate Amorphous Materials ACM TRANSACTIONS ON GRAPHICS Zhang, Y., Yu, W., Liu, C., Kemp, C., Turk, G. 2020; 39 (6)
  • Learning to Collaborate From Simulation for Robot-Assisted Dressing IEEE ROBOTICS AND AUTOMATION LETTERS Clegg, A., Erickson, Z., Grady, P., Turk, G., Kemp, C. C., Liu, C. 2020; 5 (2): 2746–53
  • Learning a Control Policy for Fall Prevention on an Assistive Walking Device Kumar, V. C., Ha, S., Sawicki, G., Liu, C. K. 2020
  • Assistive Gym: A Physics Simulation Framework for Assistive Robotics Erickson, Z., Gangaram, V., Kapusta, A., Liu, C. K., Kemp, C. C. 2020
  • Visualizing Movement Control Optimization Landscapes. IEEE transactions on visualization and computer graphics Hämäläinen, P., Toikka, J., Babadi, A., Liu, K. 2020; PP

    Abstract

    A large body of animation research focuses on optimization of movement control, either as action sequences or policy parameters. However, as closed-form expressions of the objective functions are often not available, our understanding of the optimization problems is limited. Building on recent work on analyzing neural network training, we contribute novel visualizations of high-dimensional control optimization landscapes; this yields insights into why control optimization is hard and why common practices like early termination and spline-based action parameterizations make optimization easier. For example, our experiments show how trajectory optimization can become increasingly ill-conditioned with longer trajectories, but parameterizing control as partial target states (e.g., target angles converted to torques using a PD controller) can act as an efficient preconditioner. Both our visualizations and quantitative empirical data also indicate that neural network policy optimization scales better than trajectory optimization for long planning horizons. Our work advances the understanding of movement optimization, and our visualizations should also provide value in educational use.
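
    The target-state parameterization the abstract mentions can be sketched in a few lines: instead of optimizing torques directly, the optimizer outputs a target angle and a PD controller converts it to a torque. The gains, time step, and unit-inertia joint below are illustrative assumptions, not values from the paper.

    ```python
    def pd_torque(q, qdot, q_target, kp=100.0, kd=10.0):
        """Convert a target joint angle into a torque via a PD law,
        the action parameterization the abstract describes as a
        preconditioner for movement optimization."""
        return kp * (q_target - q) - kd * qdot

    # Simulate one 1-DoF joint with unit inertia tracking a fixed target,
    # using semi-implicit Euler integration.
    dt = 0.01
    q, qdot = 0.0, 0.0
    q_target = 0.5
    for _ in range(1000):
        tau = pd_torque(q, qdot, q_target)
        qdot += tau * dt   # qddot = tau / I, with I = 1
        q += qdot * dt
    ```

    After the loop, the joint angle has settled near the target; the optimizer only ever had to choose `q_target`, while the PD law handled the stiff torque-level dynamics.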

    View details for DOI 10.1109/TVCG.2020.3018187

    View details for PubMedID 32816675

  • Estimating Mass Distribution of Articulated Objects using Non-prehensile Manipulation Kumar, K. N., Essa, I., Ha, S., Liu, C. K. 2020
  • Bodies at Rest: 3D Human Pose and Shape Estimation from a Pressure Image using Synthetic Data Clever, H. M., Erickson, Z., Kapusta, A., Turk, G., Liu, C., Kemp, C. C. IEEE. 2020: 6214–23
  • Learning a Control Policy for Fall Prevention on an Assistive Walking Device Kumar, V., Ha, S., Sawicki, G., Liu, C. K. 2020
  • Personalized collaborative plans for robot-assisted dressing via optimization and simulation AUTONOMOUS ROBOTS Kapusta, A., Erickson, Z., Clever, H. M., Yu, W., Liu, C., Turk, G., Kemp, C. C. 2019; 43 (8): 2183–2207
  • Synthesis of Biologically Realistic Human Motion Using Joint Torque Actuation ACM TRANSACTIONS ON GRAPHICS Jiang, Y., Van Wouwe, T., De Groote, F., Liu, C. 2019; 38 (4)
  • Sim-to-Real Transfer for Biped Locomotion IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Yu, W., Kumar, V. C., Turk, G., Liu, C. 2019
  • Policy Transfer with Strategy Optimization Yu, W., Liu, C., Turk, G. 2019
  • Multidimensional Capacitive Sensing for Robot-Assisted Dressing and Bathing Erickson, Z., Clever, H. M., Gangaram, V., Turk, G., Liu, C., Kemp, C. C. 2019