How I make science animations
Based on Artem Kirsanov's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
No single tool completes an entire science animation pipeline; tasks are assigned to the software best suited for them.
Briefing
Science animations are built from a deliberately mixed toolkit: Python (for mathematically generated visuals), Blender (for true 3D), and Adobe After Effects (for composition, timing, and the final polish). The core workflow hinges on a practical rule—no single tool produces every piece cleanly—so the production pipeline assigns tasks to whichever software handles them best, then stitches the results together with tight synchronization.
After Effects is the “workhorse” for most simple animation and for assembling outputs from other programs into a final render. It’s favored because it hits a balance between capability and usability for tasks like styling text, animating elements onto screen, adding motion (wiggles, hue shifts), and transitioning scenes. More specialized effects—realistic explosions, keying, advanced color correction, 3D tracking—are left to dedicated tools when needed.
Python becomes essential when the animation can’t be done by hand or through After Effects scripting alone. After Effects’ JavaScript API is described as too limited for complex, programmatic visualization tasks such as generating large numbers of connected elements that obey mathematical rules. Python’s strength is numerical control—variables, loops, recursion—and the ability to generate frames from math directly. Two Python libraries dominate this stage: Manim (originally developed by Grant Sanderson and now community-maintained), used in earlier videos for mathematical animation, and matplotlib, preferred for its more flexible frame-by-frame control. Matplotlib is used for animations of plots being drawn and for visualizing complex systems (including examples like an Ising model and artificial neural networks). Manim still earns a place for graph animations thanks to its out-of-the-box graph-theory support.
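As a concrete illustration of the frame-by-frame control matplotlib offers, here is a minimal sketch of a “plot being drawn” animation that writes one PNG per frame for later assembly in After Effects. The function name, frame count, and styling are my own choices for the sketch, not details from the video:

```python
import os

import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt


def render_draw_on(n_frames=120, out_dir="frames"):
    """Progressively reveal a sine curve, saving one PNG per frame."""
    os.makedirs(out_dir, exist_ok=True)
    x = np.linspace(0, 4 * np.pi, 1000)
    y = np.sin(x)
    for f in range(n_frames):
        # how much of the curve is visible in this frame
        visible = int(len(x) * (f + 1) / n_frames)
        fig, ax = plt.subplots(figsize=(6, 3), facecolor="black")
        ax.set_facecolor("black")  # black background drops out later via "Screen"
        ax.plot(x[:visible], y[:visible], color="cyan", lw=2)
        ax.set_xlim(x[0], x[-1])
        ax.set_ylim(-1.2, 1.2)
        ax.axis("off")
        fig.savefig(os.path.join(out_dir, f"frame_{f:04d}.png"), dpi=100)
        plt.close(fig)
```

Rendering many more frames than the final clip needs is deliberate: After Effects can then drop or repeat frames via time remapping without a rerender.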
Blender is the go-to for 3D work because After Effects and Python are described as limited for full three-dimensional scenes. Blender’s Python API also enables programmatic 3D visualization, which matters for neuron-like objects, maze-running mice, and surface plots.
Around this backbone sits a supporting cast from Adobe: Illustrator for vector assets and diagrams that update cleanly inside After Effects; Photoshop for thumbnails and raster touch-ups like subject cutouts and minor color work; Premiere Pro for classic editing tasks like trimming, sequencing, and audio integration; and Adobe Audition for audio cleanup and enhancement (noise reduction, removing plosives, and reducing unwanted mouth sounds).
A detailed example shows how matplotlib and After Effects work together. A rotating arrow traces out a sinusoid with time-varying frequency; matplotlib generates the wave and its frequency-related visuals as pre-rendered videos with thousands of frames. After Effects then handles layout, text, and synchronization to the voice-over via time remapping. Black backgrounds from matplotlib are effectively removed with blending modes (notably “Screen,” which lets black pixels drop out), so layers can be stacked without obstructing one another. Gradient fills are created in After Effects by applying gradient ramps and masking techniques that recolor a white fill only where needed. The “squishing” and morphing steps rely on seamless stitching: the final frame of one matplotlib render matches the initial frame of the next, so After Effects can transition between them while time remapping controls pacing.
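The time-varying-frequency wave in this example can be generated by integrating the instantaneous frequency to obtain the phase, which keeps the wave continuous as the frequency changes. A NumPy sketch of that idea (the function name and the linear frequency ramp are illustrative assumptions, not from the video):

```python
import numpy as np


def chirp_wave(t, freq_of_t):
    """Sine wave whose instantaneous frequency follows freq_of_t(t).

    The phase is the cumulative integral of the frequency, so the wave
    stays continuous even while the frequency changes over time.
    """
    freqs = freq_of_t(t)
    dt = np.diff(t, prepend=t[0])  # first step contributes zero phase
    phase = 2 * np.pi * np.cumsum(freqs * dt)
    return np.sin(phase)


# Over-render: thousands of samples, so After Effects time remapping
# can later speed up or slow down the motion without visible stutter.
t = np.linspace(0.0, 5.0, 5000)
wave = chirp_wave(t, lambda t: 1.0 + 0.5 * t)  # frequency ramps up over time
```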
The same pipeline logic extends to more ambitious projects: biophysically accurate neuron voltage propagation built via a Blender add-on (“Blender Spike”) that imports simulation results into Blender; probability-distribution slicing created by generating height maps with matplotlib and displacing Blender grids, then using node-based shader mixing to reveal thin cut lines; importing brain atlas meshes via the BrainGlobe API and a helper add-on (“BlenderBrain”); and a branching-model information-transmission animation in which network states are simulated in Python and then smoothed with convolution (SciPy) to avoid choppy frame-by-frame switching. Even rearranging neuron layouts is handled by combining simulation data with Manim graph objects and coordinated animations.
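The convolution-based smoothing step can be sketched as follows, assuming node activity is stored as a frames × nodes array. The array layout, the Gaussian kernel, and the sigma value are my assumptions for illustration; the video only specifies that SciPy convolution is used to avoid abrupt frame-to-frame switching:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d


def smooth_activity(states, sigma=3.0):
    """Smooth discrete per-node activity along the time axis.

    states: (n_frames, n_nodes) array of 0/1 activations that would
    otherwise flip abruptly between frames. Returns a float array
    whose values ramp smoothly between the discrete states.
    """
    return gaussian_filter1d(states.astype(float), sigma=sigma, axis=0)


# Example: one node that switches on halfway through the clip
states = np.zeros((100, 4))
states[50:, 0] = 1.0
smoothed = smooth_activity(states)
```

Driving node brightness or size from the smoothed values (rather than the raw binary states) is what removes the choppiness in the rendered animation.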
Across all these examples, the takeaway is operational: generate the math-driven building blocks programmatically, over-render for flexible timing, and let After Effects orchestrate composition and synchronization—often with time remapping as the key control lever.
Cornell Notes
The workflow for science animations is built on tool specialization rather than a single “all-in-one” program. Python generates mathematically defined visuals (often with matplotlib for fine frame control), Blender renders true 3D scenes using its Python API, and After Effects composes everything, synchronizes timing to voice, and applies final styling. A central technique is exporting matplotlib animations with very high frame counts, then using After Effects time remapping to speed up, slow down, or ease the motion without rerendering. Black-background matplotlib renders are made usable in After Effects by converting black to transparency with blending modes, enabling layered composition and gradient recoloring. The same approach scales from wave plots to neuron simulations, probability-surface slicing, brain atlas imports, and branching-network activity.
Why doesn’t one software handle every step of science animation production?
What makes matplotlib a better fit than Manim for many plot-based animations?
How does the “matplotlib + After Effects” pipeline work in practice?
Why over-render with a huge number of frames in Python/matplotlib?
How is neuron voltage propagation animated using simulation data?
How are probability distribution “slices” created inside Blender?
Review Questions
- What specific tasks are best handled by After Effects versus Python versus Blender in this workflow?
- Describe how time remapping and high frame counts enable smooth speed changes without rerendering.
- In the wave example, how do blending modes and masking make matplotlib renders compositable in After Effects?
Key Points
1. No single tool completes an entire science animation pipeline; tasks are assigned to the software best suited for them.
2. After Effects is used for composition, layer animation, and final synchronization, while Python generates math-driven visuals programmatically.
3. Matplotlib is favored for plot-based animations because it allows detailed frame-by-frame control (e.g., progressive drawing via per-segment alpha).
4. Blender is used for true 3D scenes and can also be automated through its Python API.
5. Over-rendering with thousands of frames makes later speed changes practical via frame dropping or frame repetition in After Effects.
6. Black-background matplotlib outputs are made compositable in After Effects by converting black to transparency using blending modes like “Screen.”
7. Complex animations (neurons, probability surfaces, brain atlases, branching networks) reuse the same core idea: generate data-driven visuals in code, then orchestrate them in After Effects.