Motion Transfer
The Motion Transfer node (internally called “Animation”) transfers motion from a reference video onto a character image. The character in the image mimics the movements from the reference video.
Inputs
| Handle ID | Data Type | Label |
|---|---|---|
| image-in | Image | Character |
| video-in | Video | Reference |
Outputs
| Handle ID | Data Type | Label |
|---|---|---|
| video-out | Video | Video |
Available engines
| Engine ID | Label | Cost (per second) |
|---|---|---|
| kling-2.6-motion | Kling 2.6 Motion Control | $0.12 (Pro) |
| fal-dreamactor-v2 | DreamActor v2 | $0.09 |
How it works
- Provide a character image — the person or character to animate
- Provide a reference video — the motion source (e.g., a dancing TikTok, a person talking)
- The AI maps the reference motion onto the character
- The output is a video of your character performing the reference motion
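The wiring above can be sketched as a node payload. The handle IDs (`image-in`, `video-in`, `video-out`), engine IDs, and the Tier/Seed options are taken from the tables on this page, but the `build_motion_transfer_node` helper and the payload shape are hypothetical, for illustration only:

```python
def build_motion_transfer_node(character_image_url, reference_video_url,
                               engine="kling-2.6-motion", tier="standard",
                               seed=None):
    """Assemble a hypothetical Motion Transfer node payload.

    Handle and engine IDs come from the tables above; the surrounding
    structure is an assumption, not an actual API.
    """
    node = {
        "type": "animation",  # internal name for Motion Transfer
        "engine": engine,
        "inputs": {
            "image-in": character_image_url,   # Character
            "video-in": reference_video_url,   # Reference (motion source)
        },
        "outputs": ["video-out"],              # Video
    }
    # Only Kling 2.6 Motion Control exposes advanced options (Tier, Seed)
    if engine == "kling-2.6-motion":
        node["options"] = {"tier": tier}
        if seed is not None:
            node["options"]["seed"] = seed
    return node
```

For example, `build_motion_transfer_node("mascot.png", "dance.mp4", tier="pro", seed=42)` would wire a character image and a reference video into a Pro-tier Kling run with a fixed seed.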
Advanced options
Kling 2.6 Motion Control
| Parameter | Options | Default |
|---|---|---|
| Tier | Standard / Pro | Standard |
| Seed | Any integer | Random |
DreamActor v2
No advanced options — uses a flat per-second rate.
Credit cost
| Engine | 5s Standard | 5s Pro |
|---|---|---|
| Kling 2.6 Motion | 4 credits | 6 credits |
| DreamActor v2 | 5 credits | — |
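The 5-second figures above can be turned into a rough estimator, assuming credits scale linearly with duration (an assumption; actual billing may round or price differently):

```python
# Per-second credit rates derived from the 5-second figures in the
# table above. Linear scaling is an assumption, not a billing guarantee.
CREDITS_PER_SECOND = {
    ("kling-2.6-motion", "standard"): 4 / 5,   # 4 credits per 5 s
    ("kling-2.6-motion", "pro"):      6 / 5,   # 6 credits per 5 s
    ("fal-dreamactor-v2", "standard"): 5 / 5,  # 5 credits per 5 s (flat rate)
}

def estimate_credits(engine, duration_s, tier="standard"):
    """Estimate credit cost for a clip of duration_s seconds."""
    return CREDITS_PER_SECOND[(engine, tier)] * duration_s
```

So a 10-second Pro-tier Kling clip would come to roughly `estimate_credits("kling-2.6-motion", 10, "pro")` = 12 credits under this assumption.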
Tips
- Use a clear, front-facing character image for best results
- The reference video should have clear, visible body movements
- DreamActor v2 specializes in facial animation and body movement
- Combine with TikTok Scraper to use trending motions as references
- Use Fixed Context Media to keep the same character across multiple videos
Example use cases
- Making an AI-generated character dance to a TikTok trend
- Animating a brand mascot with natural body movements
- Creating talking-head videos with AI characters