
How I make science animations

Artem Kirsanov·
6 min read

Based on Artem Kirsanov's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

No single tool completes an entire science animation pipeline; tasks are assigned to the software best suited for them.

Briefing

Science animations are built from a deliberately mixed toolkit: Python (for mathematically generated visuals), Blender (for true 3D), and Adobe After Effects (for composition, timing, and the final polish). The core workflow hinges on a practical rule—no single tool produces every piece cleanly—so the production pipeline assigns tasks to whichever software handles them best, then stitches the results together with tight synchronization.

After Effects is the “workhorse” for most simple animation and for assembling outputs from other programs into a final render. It’s favored because it strikes a balance between capability and usability for tasks like styling text, animating elements onto the screen, adding motion (wiggles, hue shifts), and transitioning between scenes. More specialized effects—realistic explosions, keying, advanced color correction, 3D tracking—are left to dedicated tools when needed.

Python becomes essential when the animation can’t be done by hand or by After Effects scripting alone. After Effects’ JavaScript API is described as limited for complex, programmatic visualization tasks such as generating large numbers of connected elements that obey mathematical rules. Python’s strength is numerical control—variables, loops, recursion—and the ability to generate frames from math directly. Two Python libraries dominate this stage: Manim (originally developed by Grant Sanderson and now community-maintained) for earlier mathematical animation work, and matplotlib for more flexible, frame-by-frame control. Matplotlib is used for animations of plots being drawn and for visualizing complex systems (including examples like an Ising model and artificial neural networks). Manim still earns a place for graph animations thanks to its out-of-the-box graph-theory support.

Blender is the go-to for 3D work because After Effects and Python are described as limited for full three-dimensional scenes. Blender’s Python API also enables programmatic 3D visualization, which matters for neuron-like objects, maze-running mice, and surface plots.

Around this backbone sits a supporting cast from Adobe: Illustrator for vector assets and diagrams that update cleanly inside After Effects; Photoshop for thumbnails and raster touch-ups like subject cutouts and minor color work; Premiere Pro for classic editing tasks like trimming, sequencing, and audio integration; and Adobe Audition for audio cleanup and enhancement (noise reduction, removing plosives, and reducing unwanted mouth sounds).

A detailed example shows how matplotlib and After Effects work together. A rotating arrow produces a sinusoid with time-varying frequency; matplotlib generates the wave and its frequency-related visuals as pre-rendered videos with thousands of frames. After Effects then handles layout, text, and synchronization to voice-over using time remapping. Black backgrounds from matplotlib are converted into transparency via blending modes (notably “Screen”), enabling layers to be arranged without obstruction. Gradient fills are created in After Effects by using gradient ramps and masking techniques to recolor a white fill only where needed. The “squishing” and morphing steps rely on seamless stitching: the final frame of one matplotlib render matches the initial frame of the next, so After Effects can transition between them while time remapping controls pacing.
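The matplotlib half of this pipeline can be sketched as follows. This is a minimal illustration, not the video's actual script: the signal parameters, frame count, and file naming are all hypothetical, and a real render would emit thousands of frames rather than the handful used here. The key ideas shown are integrating a time-varying frequency to get the phase, drawing on a black background for later “Screen”-mode compositing, and saving numbered frames for After Effects.

```python
import tempfile
from pathlib import Path

import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen rendering, no display needed
import matplotlib.pyplot as plt

# A sinusoid whose frequency drifts over time, as in the rotating-arrow example.
t = np.linspace(0, 4 * np.pi, 2000)
freq = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, t.size))
phase = np.cumsum(freq) * (t[1] - t[0])  # integrate frequency to obtain phase
wave = np.sin(phase)

fig, ax = plt.subplots(figsize=(8, 3), facecolor="black")
ax.set_facecolor("black")  # black background -> transparency via "Screen" in AE
ax.axis("off")
ax.set_xlim(t[0], t[-1])
ax.set_ylim(-1.2, 1.2)
(line,) = ax.plot([], [], color="white", lw=2)

# In production this loop would emit thousands of frames; 12 here for brevity.
n_frames = 12
out_dir = Path(tempfile.mkdtemp())
for i in range(n_frames):
    end = (i + 1) * t.size // n_frames  # reveal the wave progressively
    line.set_data(t[:end], wave[:end])
    fig.savefig(out_dir / f"frame_{i:04d}.png", facecolor="black", dpi=72)
```

The numbered PNG sequence (or a video encoded from it) is what After Effects would then import, retime with time remapping, and recolor with gradient ramps.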

The same pipeline logic extends to more ambitious projects: biophysically accurate neuron voltage propagation built via a Blender add-on (“Blender Spike”) that imports simulation results into Blender; probability-distribution slicing created by generating height maps with matplotlib and displacing Blender grids, then using node-based shader mixing to reveal thin cut lines; importing brain atlas meshes via the BrainGlobe API and a helper add-on (“BlenderBrain”); and a branching-model information transmission animation where network states are simulated in Python, then smoothed using convolution (SciPy) to avoid choppy frame-by-frame switching. Even rearranging neuron layouts is handled by combining simulation data with Manim graph objects and coordinated animations.
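The convolution-smoothing step for the branching model can be sketched like this. The data shape and kernel are assumptions (the source only says SciPy convolution is used to avoid choppy switching; the exact kernel isn't specified): a moving-average filter along the time axis makes each node's on/off activity fade in and out over several frames instead of flipping instantly.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

# Hypothetical branching-model output: binary node states per frame, which
# flip abruptly and look choppy if rendered directly.
rng = np.random.default_rng(0)
states = rng.integers(0, 2, size=(200, 50)).astype(float)  # (frames, nodes)

# Convolve each node's trace along the time axis with a moving-average kernel
# so activity ramps over ~9 frames; a Gaussian kernel would work similarly.
smooth = uniform_filter1d(states, size=9, axis=0)
```

The smoothed array can then drive per-node brightness or opacity frame by frame when rendering the network.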

Across all these examples, the takeaway is operational: generate the math-driven building blocks programmatically, over-render for flexible timing, and let After Effects orchestrate composition and synchronization—often with time remapping as the key control lever.

Cornell Notes

The workflow for science animations is built on tool specialization rather than a single “all-in-one” program. Python generates mathematically defined visuals (often with matplotlib for fine frame control), Blender renders true 3D scenes using its Python API, and After Effects composes everything, synchronizes timing to voice, and applies final styling. A central technique is exporting matplotlib animations with very high frame counts, then using After Effects time remapping to speed up, slow down, or ease the motion without rerendering. Black-background matplotlib renders are made usable in After Effects by converting black to transparency with blending modes, enabling layered composition and gradient recoloring. The same approach scales from wave plots to neuron simulations, probability-surface slicing, brain atlas imports, and branching-network activity.

Why doesn’t one software handle every step of science animation production?

The process is split because each tool has strengths and hard limits. After Effects is strong for composing layers, animating text and UI-like elements, and controlling timing, but it’s weak for complex programmatic visualization. Python is ideal for math-driven generation—numerical arrays, loops, and frame-by-frame control—but it’s limited for full 3D rendering and for classic video editing. Blender fills the 3D gap and can also be driven by Python, while Adobe Premiere Pro and Adobe Audition cover editing and audio cleanup that After Effects isn’t built for.

What makes matplotlib a better fit than Manim for many plot-based animations?

Matplotlib is described as more intuitive for the author and offers more direct control over individual elements on a frame-by-frame basis, even if it’s lower-level and requires more code. It supports animation primitives like FuncAnimation, enabling effects such as drawing a gradient-colored line progressively by changing per-segment opacity (alpha) over time. Manim remains useful for graph animations because it provides an out-of-the-box graph object from NetworkX.
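A progressive “draw-on” effect of this kind can be sketched with a `LineCollection`, where the curve is split into short segments and segments beyond the current draw front get zero alpha. This is an illustrative reconstruction, not the author's code; the curve, colormap, and frame count are all placeholder choices.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen rendering
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
from matplotlib.collections import LineCollection

# Split the curve into short segments so each can carry its own RGBA color.
x = np.linspace(0, 2 * np.pi, 400)
y = np.sin(x)
points = np.column_stack([x, y]).reshape(-1, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)  # (399, 2, 2)
colors = plt.cm.viridis(np.linspace(0, 1, len(segments)))     # gradient RGBA

fig, ax = plt.subplots()
lc = LineCollection(segments, colors=colors, lw=2)
ax.add_collection(lc)
ax.set_xlim(x.min(), x.max())
ax.set_ylim(-1.2, 1.2)

def draw_on_colors(frac):
    """Colors with every segment past the draw front made fully transparent."""
    c = colors.copy()
    c[int(frac * len(segments)):, 3] = 0.0  # zero alpha = not yet drawn
    return c

def update(i, n_frames=60):
    lc.set_color(draw_on_colors((i + 1) / n_frames))
    return (lc,)

ani = FuncAnimation(fig, update, frames=60, blit=True)
# ani.save("draw_on.mp4", fps=30)  # saving requires ffmpeg
```

Because the gradient lives in the per-segment colors rather than in a single line style, the reveal and the coloring stay independent, which is exactly the fine-grained control the summary attributes to matplotlib.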

How does the “matplotlib + After Effects” pipeline work in practice?

Matplotlib generates the mathematical building blocks as pre-rendered videos (often with thousands of frames). After Effects then scales and positions layers, converts black backgrounds to transparency using blending modes like “Screen,” and synchronizes playback using time remapping. Gradient fills are created in After Effects by applying gradient ramps and masking the colored result to the original white regions. Seamless morphs are achieved by stitching renders where the last frame of one animation matches the first frame of the next.

Why over-render with a huge number of frames in Python/matplotlib?

High frame counts make timing adjustments easier. If the final animation needs to play faster, After Effects can drop frames (e.g., take every 5th frame) and the motion still looks smooth. Slowing down is harder if frames are scarce; with many pre-rendered frames, slowing can be done by repeating frames, which produces acceptable motion rather than choppy artifacts. The tradeoff is larger files and longer render times, but rerendering is avoided.
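What time remapping does to a frame sequence can be made concrete with plain array indexing. This is only an analogy for what After Effects performs internally; the clip dimensions here are placeholders.

```python
import numpy as np

# Stand-in for a pre-rendered clip: 2000 frames of tiny RGB images.
frames = np.zeros((2000, 4, 4, 3), dtype=np.uint8)
frames[:, 0, 0, 0] = np.arange(2000) % 256  # tag each frame so order is visible

# Speed up 5x: keep every 5th frame. With 2000 source frames the result still
# has 400 distinct frames, so motion stays smooth.
fast = frames[::5]

# Slow down 3x: repeat each frame 3 times. Over-rendering is what makes this
# acceptable; with few source frames the repeats would read as stutter.
slow = np.repeat(frames, 3, axis=0)
```

The asymmetry the summary describes falls out directly: dropping frames always works if you over-rendered, while slowing down can only reuse frames that exist.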

How is neuron voltage propagation animated using simulation data?

A Blender add-on called Blender Spike imports biophysical simulation results into Blender. Simulations run in a neuron simulator environment using real neuron morphologies, then export results into a Blender-friendly .pickle file containing branch morphology and frame-by-frame voltage data. Blender Spike loads the .pickle, allowing customization of appearance (color maps, glow intensity). For more complex compositions, matplotlib voltage graphs can be animated from the same stored Python arrays and then combined in After Effects.
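The export side of this handoff might look like the following. The dictionary layout is a loose assumption: the summary only says the `.pickle` contains branch morphology and frame-by-frame voltage, and Blender Spike's actual on-disk schema may differ, so treat the key names and shapes as hypothetical.

```python
import pickle
import numpy as np

# Hypothetical simulation output: one branch's geometry plus a voltage trace
# for every point at every animation frame.
rng = np.random.default_rng(0)
n_points, n_frames = 120, 500
simulation = {
    "morphology": {
        "coords": rng.normal(size=(n_points, 3)),  # 3D points along the branch
        "radii": np.full(n_points, 0.5),           # branch thickness
    },
    "voltage": -65.0 + 30.0 * rng.random((n_frames, n_points)),  # mV per frame
}

with open("neuron_sim.pickle", "wb") as f:
    pickle.dump(simulation, f)
```

Because the same arrays stay available on the Python side, matplotlib voltage graphs can be animated from them and composited in After Effects alongside the Blender render, as the summary notes.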

How are probability distribution “slices” created inside Blender?

A height map is generated in Python (using matplotlib) as a black-and-white image where brightness encodes elevation. Blender then uses a grid with a Displace modifier driven by that height map to form a 3D surface. Node-based shader mixing controls slicing: a mix shader blends a gradient shader with transparency based on a driving object’s coordinate (e.g., an empty plane’s Y position thresholded into a factor). A third shader provides a white glow line at the cut boundary, and mix shader nodes are chained to work around the fact that a single Mix Shader node blends only two shaders at a time.
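The Python half of this setup is straightforward to sketch: render the surface's height field as a grayscale image that Blender's Displace modifier can read. The Gaussian surface and image size are placeholder choices, not the video's actual distribution.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen rendering
import matplotlib.pyplot as plt

# Height map for a 2D Gaussian "probability surface": brightness = elevation.
# Blender's Displace modifier pushes a subdivided grid up by pixel brightness.
x = np.linspace(-3, 3, 512)
X, Y = np.meshgrid(x, x)
Z = np.exp(-(X**2 + Y**2) / 2)  # peak of 1.0 at the center, near 0 at edges

plt.imsave("heightmap.png", Z, cmap="gray", vmin=0.0, vmax=1.0)
```

In Blender, the PNG is loaded as the Displace modifier's texture; the Strength setting then scales how far white pixels lift the grid.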

Review Questions

  1. What specific tasks are best handled by After Effects versus Python versus Blender in this workflow?
  2. Describe how time remapping and high frame counts enable smooth speed changes without rerendering.
  3. In the wave example, how do blending modes and masking make matplotlib renders compositable in After Effects?

Key Points

  1. No single tool completes an entire science animation pipeline; tasks are assigned to the software best suited for them.

  2. After Effects is used for composition, layer animation, and final synchronization, while Python generates math-driven visuals programmatically.

  3. Matplotlib is favored for plot-based animations because it allows detailed frame-by-frame control (e.g., progressive drawing via per-segment alpha).

  4. Blender is used for true 3D scenes and can also be automated through its Python API.

  5. Over-rendering with thousands of frames makes later speed changes practical via frame dropping or frame repetition in After Effects.

  6. Black-background matplotlib outputs are made compositable in After Effects by converting black to transparency using blending modes like “Screen.”

  7. Complex animations (neurons, probability surfaces, brain atlases, branching networks) reuse the same core idea: generate data-driven visuals in code, then orchestrate them in After Effects.

Highlights

After Effects’ time remapping is treated as the main timing control, made effective by exporting matplotlib animations with very large frame counts.
Matplotlib-to-After-Effects compositing relies on turning black backgrounds into transparency (via “Screen”) so layers can overlap cleanly.
A seamless morph is achieved by ensuring the last frame of one matplotlib render matches the first frame of the next, letting After Effects stitch them without visible jumps.
Neuron voltage animations are driven by importing simulation results from a .pickle file through the Blender Spike add-on, enabling both Blender rendering and matplotlib graph overlays.
Probability-surface slicing is implemented with Blender node shaders, where an invisible cutting plane drives a shader mix factor to reveal a thin glowing boundary.

Topics

  • Animation Workflow
  • Matplotlib Animation
  • After Effects Compositing
  • Blender 3D Automation
  • Neuron Visualization
