Particle systems are ubiquitous in computer graphics. They’re used for modeling snow, explosions, sparks, rain, grass, and all sorts of other things that are too hard to model and animate by hand.
A particle system is simply a collection of particles, which are tiny dots that can be made to look like anything, from smoke plumes to soccer balls. Particles are a very important tool in any CG artist’s arsenal, particularly so for animators and effects artists.
They can add realism to a scene or enhance it with otherworldly effects. If you’ve seen a big-budget movie in the past 10 years, odds are at least one explosion or dust storm in it was modeled in a 3D package and composited in after the scene was shot. Particles make these effects possible and easy.
Particle system renderers are divided into 2D renderers and 3D renderers. The advantage of 2D rendering over 3D rendering is that it’s much faster, and certain concerns prevalent in a 3D scene, such as two-sided materials and mesh complexity, simply don’t exist in 2D rendering. 2D renderers are primarily implemented as engines, since their speed is more than sufficient for real-time applications like scientific visualization or gaming, but there are a handful of standalone 2D particle rendering packages, with particleIllusion being the most prominent example.
3D particle renderers are more common, and you can usually find a particle system implementation in any package with a 3D renderer, such as Maya, 3ds max, Lightwave, or Softimage. In a 3D context, particles are used to imitate real-world phenomena that require at least some realism, such as casting proper shadows or being refracted through a scene object. This sort of thing would be rather challenging with a 2D package, even with a capable compositing application. Thus, an artist sometimes has to make a tradeoff between the speed and ease of use of 2D renderers and the realism and versatility of 3D renderers.
Emitters give birth to particles. You’ll find the terms “birth,” “death,” and “life” to be prominent in particle jargon, since they aptly describe the logical structure of a particle animation. Particles are born from an emitter, are modified in some way (usually animated), age, and then die. An emitter very often specifies important initial parameters for its particles, such as their motion and their lifespan.
The various needs of a CG artist require different types of emitters for different tasks. Generally, the two main types of emitters are point emitters and area emitters. Point emitters emit particles from a single point, and area emitters emit particles over a specified area. For effects like gun shots or dust trails, point emitters work best, since these effects represent a local reaction of an object, such as the firing of a gun or the wheel of a car against a dusty plain. However, point emitters would be quite inadequate for effects like snow storms or waterfalls. This is where area emitters come in. An area emitter births particles across the entire region it covers, which is perfect for the large areas required for snow storms. What would ordinarily take several point emitters and a lot of tweaking can easily be done with one area emitter. There are other types of emitters, like 3ds max’s Particle Array, which emits particles from any surface, coming in handy when objects need to be exploded or when dust or other particles need to be lifted off a surface.
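To make the point/area distinction concrete, here is a minimal sketch of the two emitter types in Python. The function names and the particle dictionary layout are illustrative assumptions, not any package’s actual API; the area emitter here covers a flat rectangle, as you might use for the top of a snow-storm volume.

```python
import random

def emit_point(origin, count):
    """Point emitter: every particle is born at the same single location."""
    return [{"pos": list(origin), "age": 0.0} for _ in range(count)]

def emit_area(corner, width, depth, count):
    """Area emitter: particles are born at random positions across a
    rectangle (here, a horizontal plane), covering the whole region at once."""
    x0, y0, z0 = corner
    return [
        {"pos": [x0 + random.uniform(0.0, width), y0,
                 z0 + random.uniform(0.0, depth)],
         "age": 0.0}
        for _ in range(count)
    ]
```

A gun-muzzle flash might call `emit_point` at the barrel tip, while a snow storm would call `emit_area` over the whole sky plane, replacing what would otherwise take a grid of point emitters.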
Billboard particles always face the camera. That means that no matter what angle you view the particle system from, each particle’s surface is always oriented towards the viewer. Other parameters, such as rotation, scale, and position, may affect what the surface appears like, but the orientation towards the camera always stays the same with billboard particles. They’re one of the most common particle types where rendering speed is a concern, because each one is just a rectangular polygon carrying a (typically transparent) texture. A very large number of billboard particles can be rendered on-screen in real time with any modern GPU.
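The “always face the camera” behavior comes down to a small piece of vector math: build a basis for the quad from the particle-to-camera direction. The sketch below shows one common way to do it (the names are my own, and it assumes the camera never sits directly above or below the particle, where the world-up cross product would degenerate).

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def billboard_axes(particle_pos, camera_pos, world_up=(0.0, 1.0, 0.0)):
    """Return (right, up, forward) unit vectors for a quad that faces the
    camera. The quad's corners are offset from the particle position along
    right and up; forward points at the viewer."""
    forward = normalize([c - p for c, p in zip(camera_pos, particle_pos)])
    right = normalize(cross(world_up, forward))  # degenerate if forward == world_up
    up = cross(forward, right)
    return right, up, forward
```

Recomputing these axes every frame is what keeps the flat texture pointed at the viewer regardless of camera movement.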
Billboard particles work best in situations where the “other side” of a particle is unimportant. That is, if the viewer sees the effect head-on or from far away, in places the camera won’t go, it’s much easier to use billboard particles and approximate depth as necessary. Smokestacks, far-away fires, snow, rain, and some varieties of sparks are best done using billboard particles.
The term “voxel” is a portmanteau of “volumetric pixel.” Like pixels on the screen, where tiny dots are arranged along the vertical and horizontal dimensions, voxels are additionally arranged along a third dimension. Think of voxels as blocks stacked together, where one voxel may entirely obscure the voxel behind it. This occlusion behavior is essential to using voxels in particle systems, where transparency and occlusion between particles are important when modeling certain phenomena.
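The occlusion described above is usually resolved by compositing voxels front to back along the viewing direction. Here is a minimal one-dimensional sketch of that idea (a single row of voxels, grayscale values, function name my own); a fully opaque voxel ends the accumulation because nothing behind it can show through.

```python
def composite_row(voxel_alphas, voxel_colors):
    """Front-to-back alpha compositing along one row of voxels.
    voxel_alphas[i] is the opacity of voxel i (0.0 transparent, 1.0 opaque);
    voxel_colors[i] is its grayscale emission. An opaque voxel fully hides
    everything behind it."""
    color = 0.0
    transmittance = 1.0  # how much light from behind still reaches the eye
    for a, c in zip(voxel_alphas, voxel_colors):
        color += transmittance * a * c
        transmittance *= (1.0 - a)
        if transmittance <= 0.0:
            break  # an opaque voxel: nothing further back is visible
    return color
```

A real volume renderer does this along every view ray through a 3D grid, but the per-ray math is exactly this loop.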
Any effect that requires a high degree of realism, such as clouds, fire, or explosions can be approached with the use of voxels. Packages like Lightwave have voxel rendering built in, but others like 3ds max can be augmented with plug-ins, such as FumeFX. Generally, volumetric particle effects are not included by default in most 3D packages.
The theoretical concepts behind metaballs are beyond the scope of this overview, but they can be generally described as the best high-speed simulation available for liquids and similar substances, such as the “lava” inside lava lamps. The way they work is fairly straightforward. Each particle is a sphere until it comes in contact with another particle. When two particles come within a certain distance of each other, they start forming a bridge and eventually join into the shape of a single larger blob. This effect is very common in nature, especially in the behavior of liquid drops, which makes metaballs useful in modeling fluids. Metaballs can also be used to create 3D models of objects, especially organic objects like animals or humans, as seen below.
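The bridging behavior falls out of how metaballs are defined mathematically: each ball contributes a scalar field that falls off with distance, the fields are summed, and the surface is drawn wherever the total crosses a threshold. The sketch below uses an inverse-square falloff, which is one common choice among several; the function names and falloff are illustrative assumptions, not a specific package’s formulation.

```python
def metaball_field(point, balls):
    """Scalar field at a point: each ball (cx, cy, cz, r) contributes
    r^2 / distance^2. The implicit surface is where the field crosses
    a threshold, so fields from nearby balls add up and form bridges."""
    total = 0.0
    for (cx, cy, cz, r) in balls:
        d2 = ((point[0] - cx) ** 2 + (point[1] - cy) ** 2
              + (point[2] - cz) ** 2)
        if d2 == 0.0:
            return float("inf")  # at a ball's center the field blows up
        total += (r * r) / d2
    return total

def inside(point, balls, threshold=1.0):
    """True if the point lies inside the merged metaball surface."""
    return metaball_field(point, balls) >= threshold
```

A point just outside a lone ball can end up inside the surface once a second ball approaches, because the two contributions sum past the threshold; that is exactly the bridge forming between two nearing droplets.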
Particle age is an important concept because it addresses what happens to the particle’s appearance as it moves through its lifespan. In real life, a fire has many stages: flames, soot, smoke. Sometimes the smoke is black, sometimes it’s white, sometimes it transitions from one to another due to a substance in the fire. These changes can be modeled using particle age-dependent materials and settings. Emitters usually provide a few age-dependent settings, such as size and speed, but they can’t change whether a particle looks like a flame or like black smoke. That’s where using age-dependent materials comes in. Particles can change not only color, but apparent shape, and they can even be made to look like an animation, such as a man walking. In branding shots and advertisements, where graphics are used for symbolism rather than realism, such versatility can come in very handy.
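Age-dependent materials are typically driven by a gradient keyed on normalized age (0 at birth, 1 at death). Here is a minimal sketch of that lookup; the function names and the flame-to-soot-to-smoke key values in the test are hypothetical examples, not any package’s defaults.

```python
def lerp(a, b, t):
    """Linear interpolation between two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def age_color(age, lifespan, gradient):
    """Look up an RGB color by normalized age (0.0 = birth, 1.0 = death).
    gradient is a sorted list of (normalized_age, rgb) keys, e.g. a fire
    particle might go orange flame -> dark soot -> white smoke."""
    t = max(0.0, min(1.0, age / lifespan))
    for (t0, c0), (t1, c1) in zip(gradient, gradient[1:]):
        if t0 <= t <= t1:
            return lerp(c0, c1, (t - t0) / (t1 - t0))
    return gradient[-1][1]
```

The same keyed-gradient idea extends beyond color: opacity, size, or even which frame of an animated texture to show can all be driven by the particle’s normalized age.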
The particular name for these objects varies from package to package. You’ll see them called “forces” in 3ds max or “fields” in Maya. The general idea behind effectors is that they affect the particle system in some way. Normally, the intent is to simulate real-world constraints on an otherwise ideal system. Therefore, the standard package of effectors includes things like gravity, wind, and a suite of deflectors and attractors. Particle effectors used in combination with proper emitter settings and proper lighting can create very convincing (or, alternatively, deliberately fantastical) effects.
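Under the hood, effectors are usually just functions that return a force for each particle; the simulation sums them and integrates each frame. The sketch below uses a simple Euler step and treats forces as per-unit-mass accelerations; all names here are hypothetical, not 3ds max’s or Maya’s actual interfaces.

```python
def gravity(particle, g=(0.0, -9.8, 0.0)):
    """Constant downward pull, independent of the particle's state."""
    return g

def wind(particle, direction=(1.0, 0.0, 0.0), strength=2.0):
    """Constant push along a direction; a real wind field might vary in space."""
    return tuple(d * strength for d in direction)

def step(particle, effectors, dt):
    """Advance one particle by one frame: sum the forces from every
    effector, then integrate velocity and position with an Euler step."""
    force = [0.0, 0.0, 0.0]
    for effector in effectors:
        f = effector(particle)
        for i in range(3):
            force[i] += f[i]
    for i in range(3):
        particle["vel"][i] += force[i] * dt
        particle["pos"][i] += particle["vel"][i] * dt
    return particle
```

Deflectors and attractors fit the same mold: a deflector is an effector that flips the velocity at a surface, and an attractor returns a force pointed at some target.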