Philip Rideout, 18 July 2018
Physics textbooks often have beautiful hand-drawn illustrations. For example, here's something from The Feynman Lectures on Physics (1964):
How can we generate images like this algorithmically?
David Banks and Greg Turk came up with an interesting method for creating pleasing streamline diagrams in 1996 [1]. They used an iterative process to minimize an energy function that represents the overall “quality” of the image.
In 2005, Mebarki et al. introduced a new algorithm called farthest point seeding, which generates similarly pleasing diagrams very efficiently [2]. It works by partitioning the space with a Delaunay triangulation. Their method was used to generate the following illustration:
While the above diagram is visually pleasing, it does not give a good sense of movement or directionality. This can be addressed by adding arrows:
Some of the arrowheads are deformed due to discontinuities in the source data (more on this later in the post). The above images were created with clumpy, a simple command-line tool that I'm hosting on GitHub.
Similar diagrams can be generated more robustly (though much less efficiently) with the streamplot functionality in matplotlib, which also allows you to vary color and streamline width.
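Here's a minimal sketch along those lines, using a simple analytic swirl as a stand-in for the noise-based field in this post; the styling choices are just illustrative.

import numpy as np
import matplotlib.pyplot as plt

# Placeholder velocity field: a simple counterclockwise swirl.
y, x = np.mgrid[-2:2:100j, -2:2:100j]
u, v = -y, x
speed = np.hypot(u, v)

# Color and line width both follow the local speed.
plt.streamplot(x, y, u, v,
               color=speed,
               linewidth=2.0 * speed / speed.max(),
               cmap='viridis', density=1.2)
plt.gca().set_aspect('equal')
plt.show()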
As an alternative to streamlines, matplotlib also supports quiver diagrams:
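A corresponding quiver sketch over the same kind of placeholder field, subsampled so individual arrows stay readable:

import numpy as np
import matplotlib.pyplot as plt

# Same placeholder swirl, on a coarser grid so individual arrows are visible.
y, x = np.mgrid[-2:2:20j, -2:2:20j]
u, v = -y, x
plt.quiver(x, y, u, v, np.hypot(u, v), cmap='viridis')
plt.gca().set_aspect('equal')
plt.show()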
Arrows can add clutter to a visualization, whereas motion-blurred particles can convey directionality with very little visual noise. The following image is a screencap of a particle animation with motion blur, again created with clumpy.
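The screencap itself was produced by clumpy, but the general idea can be sketched as follows: advect each particle through several sub-steps per rendered frame and splat it with reduced weight, so faster particles leave longer, fainter streaks. The function below is a hypothetical illustration of that accumulation, not clumpy's actual renderer.

import numpy as np

def render_motion_blur_frame(positions, velocities, image_size, dt, substeps=8):
    # Accumulate several sub-frame splats into one grayscale image.
    image = np.zeros(image_size, dtype=np.float32)
    weight = 1.0 / substeps
    for _ in range(substeps):
        positions = positions + velocities * (dt / substeps)
        cols = np.clip(positions[:, 0].astype(int), 0, image_size[1] - 1)
        rows = np.clip(positions[:, 1].astype(int), 0, image_size[0] - 1)
        np.add.at(image, (rows, cols), weight)  # splat one pixel per particle
    return np.clip(image, 0.0, 1.0), positions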
Of course, an even better visualization is to simply show the animation. The following video is a 4-second loop, made seamless by a simple algorithm that I'll describe shortly.
Shrinking the particle radius and increasing the count creates a fairly pleasing result. I think this might be the best visualization so far:
The above videos were rendered in less than 1 second on my laptop with clumpy.
When capturing a particle simulation, how can we make a video that seamlessly loops over an n-second interval? When the interval ends, it would be visually jarring to reset all particles to their original positions en masse.
An easy fix is to add an initialization phase that moves each particle along its streamline for a random amount of time before returning it to its starting point. After this initial phase, each particle stores its own integer “age” so that it knows when to reset. Here's the procedure in pseudocode:
kLoopDuration = 4 seconds
kExpectedFps = 60
kFramesPerLoop = kLoopDuration * kExpectedFps
record_video = False
current_frame = 0

for each particle:
    particle.age_offset = random integer in [0, kFramesPerLoop)
    particle.original_position = random point sample per Christensen [4] or Bridson [5]
    particle.current_position = particle.original_position
    particle.current_age = 0

for each time step:
    current_frame++

    # Record video only during the second interval.
    if current_frame >= 2 * kFramesPerLoop:
        record_video = False
    elif current_frame >= kFramesPerLoop:
        record_video = True

    # Perform advection, sampling velocity from the vector field at each particle's position.
    for each particle:
        particle.current_position += kTimeStep * particle.velocity
        particle.current_age++

        # In the first interval, reset the particle after a random amount of time.
        if current_frame < kFramesPerLoop and current_frame >= particle.age_offset:
            particle.current_position = particle.original_position
            particle.current_age = 0
            particle.age_offset = kFramesPerLoop

        # In every subsequent interval, simply reset the particle when it expires.
        if current_frame >= kFramesPerLoop and particle.current_age >= kFramesPerLoop:
            particle.current_position = particle.original_position
            particle.current_age = 0
clumpy uses a C++ implementation of this in its advect_points command.
For a quick diversion, I'll explain how I generated the data for the above visualizations.
I started with a C1 smooth field of scalars in [−1,+1]. This can be created by summing up a few octaves of gradient noise:
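As a rough sketch of that step, the following sums octaves by hand. It assumes the third-party Python noise package for 2D gradient (Perlin) noise, and the final division only keeps the result approximately within [−1, +1].

import numpy as np
from noise import pnoise2  # third-party "noise" package (assumed available)

def fbm(width, height, octaves=4, base_frequency=4.0):
    # Sum a few octaves of gradient noise, halving the amplitude and
    # doubling the frequency at each octave.
    field = np.zeros((height, width), dtype=np.float32)
    amplitude, frequency, total = 1.0, base_frequency, 0.0
    for _ in range(octaves):
        for row in range(height):
            for col in range(width):
                field[row, col] += amplitude * pnoise2(
                    frequency * col / width, frequency * row / height)
        total += amplitude
        amplitude *= 0.5
        frequency *= 2.0
    return field / total  # approximately in [-1, +1]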
Next, I needed to transform the scalar field into a vector field. One way to do this is to interpret each scalar as an angle:
$$x \Rightarrow \langle \sin(x\pi),\; \cos(x\pi) \rangle$$
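A minimal sketch of this conversion, assuming field is the 2D scalar array in [−1, +1] from the previous step:

import numpy as np

def scalars_to_angles(field):
    # Treat each scalar in [-1, +1] as an angle in [-pi, +pi].
    return np.sin(field * np.pi), np.cos(field * np.pi)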
This yields the following vector field. The red channel shows the X component, and the green channel shows Y. I decided not to use this since it has little resemblance to anything physical.
Another strategy for converting scalar noise into vectors is to compute (or recover) the gradient of the noise, then take the 2D perpendicular.
$$\psi \Rightarrow \left( \frac{\partial \psi}{\partial y},\; -\frac{\partial \psi}{\partial x} \right)$$
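A minimal sketch of that operation with NumPy, assuming psi is a 2D array whose rows index y and columns index x:

import numpy as np

def perp_of_gradient(psi):
    # np.gradient returns derivatives along axis 0 (y) and axis 1 (x).
    dpsi_dy, dpsi_dx = np.gradient(psi)
    return dpsi_dy, -dpsi_dx  # (vx, vy); divergence-free by construction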
The resulting velocity field has the nice property of behaving like an incompressible fluid. Moreover, it behaves reasonably near boundaries if you ramp down the source values according to a distance field [3]. Here's the distance field that I used:
This is the result after multiplying distance with noise and computing the perp of the gradient:
One issue with this approach is that the distance field does not have C1 continuity, so the resulting “fluid” has unrealistic sharp angles. The above images were all generated using clumpy.
[1] Image-Guided Streamline Placement by David Banks, Greg Turk. (1996)
[2] Farthest Point Seeding for Placement of Streamlines by Mebarki, Alliez, Devillers. (2005)
[3] Curl-Noise for Procedural Fluid Flow by Bridson, Hourihan, Nordenstam. (2007)
[4] Progressive Multi-Jittered Sample Sequences by Per Christensen, Andrew Kensler, Charlie Kilpatrick. (2018)
[5] Fast Poisson Disk Sampling in Arbitrary Dimensions by Robert Bridson. (2007)