HeyGen Open-Sources HyperFrames, a Remotion Rival: The Era of Writing Video in HTML Is Here

HyperFrames demo animation: HTML code on the left renders into video on the right in real-time

HeyGen's newly open-sourced HyperFrames framework addresses a long-standing tension in video production: professional tools have a steep learning curve, while simple tools lack flexibility. Its HTML-based approach turns video itself into a programmable object.

Why Choose HTML Over React?

Although both HyperFrames and Remotion can achieve programmatic video generation, their design philosophies are vastly different.

Remotion is built on the React architecture, making it more suitable for teams with an existing React tech stack, especially for scenarios requiring the batch generation of a large volume of videos from spreadsheets.

In contrast, HyperFrames excels at rapidly producing single, high-quality videos and works particularly smoothly with AI agents. HeyGen's Joshua Xu explained the reasoning behind this choice: Large Language Models (LLMs) are trained on vast amounts of HTML and web code, whereas React combined with Remotion represents only a tiny fraction of their training data.

This design decision brings measurable practical advantages. Developer Misbah Syed used the same prompt to have Claude Opus 4.7 generate videos with both frameworks: HyperFrames finished rendering in just 60 seconds, while Remotion took 162 seconds plus an additional 4 minutes of initial build time. The output is also lighter, at only 4 MB versus Remotion's typical 14 MB. Below is a comparison video:

Core Features and Workflow

The core philosophy of HyperFrames is "one file in, video out." It uses plain HTML tags combined with data attributes to control the timeline, such as data-start="2" and data-duration="3", avoiding the overhead of a virtual DOM.
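As a minimal sketch of that idea (only the data-start/data-duration attributes come from the documentation; the element structure, class names, and text are illustrative assumptions):

```html
<!-- Hypothetical minimal HyperFrames scene. Each element declares
     when it appears on the timeline (data-start, in seconds) and
     how long it stays visible (data-duration). -->
<div data-start="0" data-duration="2">
  <h1>Hello, HyperFrames</h1>
</div>
<div data-start="2" data-duration="3">
  <p>This element appears at second 2 and remains for 3 seconds.</p>
</div>
```

Because the whole scene is ordinary HTML, any LLM that can write a web page can, in principle, write a video.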

Its AI-first design allows agents to directly generate valid code via the /hyperframes command without struggling with hooks and lifecycle rules as one would when using React.

This design yields tangible benefits in actual usage. For example, a user can say, "Make the title twice as large, switch to dark mode, and add a fade-out effect at the end," and the AI agent can directly understand and execute the request.

Practical Application Scenarios

HyperFrames' applications extend far beyond traditional video production. It can automatically convert CSV data into dynamic chart videos, generate subtitled tutorials using Text-to-Speech (TTS) synthesis, or batch-produce e-commerce product display templates.
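The subtitled-tutorial use case can be sketched the same way; again, only the timing attributes are documented, and the subtitle markup itself is a hypothetical illustration:

```html
<!-- Illustrative subtitle track: each caption is a plain HTML element
     scheduled on the timeline, so a TTS pipeline only needs to emit
     one element per spoken line with matching timestamps. -->
<div class="subtitle" data-start="0" data-duration="3">Welcome to the tutorial.</div>
<div class="subtitle" data-start="3" data-duration="4">First, install the framework.</div>
<div class="subtitle" data-start="7" data-duration="3">Then write your first scene.</div>
```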

It also provides over 50 plug-and-play effect blocks. For instance, using a command like npx hyperframes add instagram-follow allows developers to quickly integrate social media overlays.

Getting Started

Installing HyperFrames is very simple, requiring just a single command:

npx skills add heygen-com/hyperframes

This command not only installs the framework but also automatically installs relevant skills for your AI agent, enabling it to understand how to use HyperFrames' specific syntax.

Developers who have tried it say its greatest advantage is "replacing professional video software with the standard web technology stack." Note, however, the environment requirements: Node.js 22+ and FFmpeg. For teams that frequently produce standardized videos, this is a technical path worth watching.

In any case, HyperFrames is a major new entry in "vibe video" tooling after Remotion. The author has previously produced videos with Remotion on their video channel and plans to try HyperFrames for future productions; interested readers are welcome to follow the updates.

Project Address: https://github.com/heygen-com/hyperframes
