'Neon Seoul' is a 10-minute sci-fi short film produced with Skaper in February 2026. No actors, no camera, no studio: the film was completed with only Skaper and music software. This post shares the full experience and lessons learned, from concept to final edit.
Planning: Start with the Story
AI video production wasn't the goal from the start. I discovered Skaper while searching for a way to actually make a story I'd been developing for years — a cyberpunk short set in Seoul in 2050. Finishing the story first was enormously helpful throughout production. Knowing exactly what mood and composition each scene needed meant the direction never wavered while writing prompts. AI video production is, in the end, still story-first.
Pre-Production: Scene List and Visual References
I broke the script into 23 scenes and prepared a reference image and keyword notes for each. I gathered color and lighting references from Blade Runner 2049, Ghost in the Shell, and Akira. At this stage I also planned engine allocation — Runway for this scene, Kling for that one. Slow, long camera moves went to Kling; fast action with heavy lighting changes went to Runway Gen-4.
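The scene list and engine allocation above can be sketched as a simple data structure. Skaper itself has no public API that I know of, so every field and function name below is purely illustrative, not part of any real tool:

```python
# Illustrative scene-planning structure; all names here are hypothetical,
# not part of Skaper or any engine's API.
scenes = [
    {"id": 1, "mood": "rain-soaked alley, neon reflections",
     "reference": "blade_runner_2049_still.jpg",
     "camera": "slow push-in", "engine": "Kling"},
    {"id": 2, "mood": "rooftop chase, strobing signage",
     "reference": "akira_reference.jpg",
     "camera": "fast whip-pan", "engine": "Runway Gen-4"},
]

def assign_engine(camera_note: str) -> str:
    """Rule of thumb from the text: slow, long camera moves go to Kling;
    fast action with heavy lighting changes goes to Runway Gen-4."""
    fast_cues = ("fast", "whip", "action", "strobe")
    return "Runway Gen-4" if any(cue in camera_note for cue in fast_cues) else "Kling"

for scene in scenes:
    assert assign_engine(scene["camera"]) == scene["engine"]
```

The point is less the code than the discipline: deciding mood, reference, camera move, and engine per scene before generating anything is what kept the 23-scene shoot from drifting.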
Production: 23 Scenes over 4 Days
I generated all 23 scenes over 4 days, averaging 5–6 scenes per day. Each scene took an average of 4–5 regenerations before final selection. Total credits consumed: approximately 1,200, about ₩60,000 at the time. More than I expected, but producing video of this quality traditionally would have cost millions of won at minimum. The hardest scene was the protagonist running through a rain-soaked alley; it took 11 regenerations in Kling Master before the running motion looked natural.
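The figures above imply some useful unit economics. A quick back-of-the-envelope calculation (using only the numbers stated in the text; the 4.5 regeneration average is the midpoint of the stated 4–5 range):

```python
# Back-of-the-envelope production cost from the figures in the text.
total_credits = 1200
total_cost_krw = 60_000
scene_count = 23
avg_regens = 4.5  # midpoint of the stated 4-5 regenerations per scene

krw_per_credit = total_cost_krw / total_credits           # 50 KRW per credit
generations = scene_count * avg_regens                    # ~104 generations total
credits_per_generation = total_credits / generations      # ~11.6 credits each
cost_per_scene_krw = total_cost_krw / scene_count         # ~2,609 KRW per scene

print(f"{krw_per_credit:.0f} KRW/credit, "
      f"{credits_per_generation:.1f} credits/generation, "
      f"{cost_per_scene_krw:,.0f} KRW/scene")
# → 50 KRW/credit, 11.6 credits/generation, 2,609 KRW/scene
```

So each finished scene cost under ₩3,000 including every discarded take, which is the basis for the "millions of won" comparison to traditional production.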
Maintaining Protagonist Consistency with FaceSet
I used the FaceSet node extensively to keep the protagonist's face consistent. I first generated three 'ideal protagonist' images with AI, then registered them as FaceSet references. Since these were AI-generated images rather than photos of real people, there were no likeness or copyright concerns. The protagonist appears in 17 of the 23 scenes, and I used FaceSet in all of them; the result was quite satisfying. Not perfect, but convincing enough to read as the same person throughout.
Post-Production: Edit and Music
I imported the generated clips into DaVinci Resolve and assembled them in script order. Some scenes used speed adjustments (slow-motion effects) to heighten atmosphere. The music was an ambient track generated with Suno AI, layered with synthesizer sounds I composed myself. Sound design took longer than expected, but once the music was finished the overall quality of the video jumped noticeably. The final cut ran 9 minutes 42 seconds — almost exactly the 10 minutes planned in the script.
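The runtime and retiming decisions above reduce to simple arithmetic. A small sketch using only the figures from the text (the 10-second example clip is hypothetical, chosen just to illustrate the speed-change math):

```python
# Rough timeline arithmetic from the figures in the text.
final_runtime_s = 9 * 60 + 42                   # final cut: 9 min 42 s = 582 s
scene_count = 23
avg_scene_s = final_runtime_s / scene_count     # ~25.3 s of timeline per scene

def retimed_duration(source_s: float, speed: float) -> float:
    """On-timeline duration after a speed change; speed=0.5 means 50% slow motion."""
    return source_s / speed

# A hypothetical 10 s clip played at 50% speed fills 20 s of timeline,
# which is why slow-motion scenes stretch the cut noticeably.
print(f"avg scene: {avg_scene_s:.1f} s, "
      f"10 s clip @ 50%: {retimed_duration(10, 0.5):.0f} s")
```

Knowing that each scene only needs to carry about 25 seconds of screen time also explains why short generated clips, stretched where needed, were enough to fill a near-10-minute film.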
Results and Lessons Learned
'Neon Seoul' reached over 80,000 views on YouTube within one month. Even with the AI-made disclosure, many commented that it 'felt like a real film'. Three takeaways: First, story comes first — no good AI video without a good story. Second, time investment is necessary — AI doesn't produce a finished product instantly; significant time goes into prompt refinement and regeneration. Third, cost efficiency is extraordinary — this was completed on a budget that would have been impossible with traditional production. AI video is the most powerful tool for anyone who has a story to tell.