Categories: Image

Time-to-Move + Wan 2.2 Test

Made this using mickmumpitz’s ComfyUI workflow, which lets you animate movement by manually shifting objects or images in the scene. I tested both my higher-quality camera and my iPhone, and for this demo I chose the lower-quality footage with imperfect lighting. That roughness made it feel more grounded, almost as if the movement had been captured naturally in real life. I might do another version with higher-quality footage later, just to try a different approach. Here’s mickmumpitz’s tutorial if anyone is interested: https://youtu.be/pUb58eAZ3pc?si=EEcF3XPBRyXPH1BX

submitted by /u/enigmatic_e

AI Generated Robotic Content

Tags: ai images
