Physical vs Motivated lighting

One of the most common discussions in lighting pipelines is how strongly the technology should be biased towards either physically based or motivated lighting. The challenge is far more complex than it may sound to an untrained ear.

Both approaches have critical advantages, as well as disadvantages.

Based on my experience, motivated lighting is a key factor in both full-CG and live-action shots.
Even though an audience or a production may not always relate to it directly, effective lighting cues are always a key driver of narration, composition and emotional response, regardless of the medium.

As the principle of "Suspension of Disbelief" shows, the human mind is biased to respond to abstract cues to a degree that goes beyond physical accuracy. Willingly or not, this applies to lighting as well.


Tools have to be built strong enough to maintain realistic continuity and credibility... but with a motivated flair.

---------------------------------------------------------------------------------------------------------------------------------------------------------------

In the article "State of the Art" from Cinefex issue #100, which featured a fantastic interview with some of the leading minds in the visual effects world, great care was put into highlighting that one of the biggest challenges in lighting pipelines was still the ability to give TDs a complete set of efficient lighting tools.


CINEFEX: As far as digital technology has come, what limitations does it still have? What tools do you wish you had in that toolbox?

JOHN KNOLL: Nothing is ever fast enough. Our render times have remained fairly constant since The Abyss! They're always at the pain threshold. If you look at how long it took to render one frame on The Abyss and what it takes to render a typical frame on the new Star Wars, they are fairly comparable. Of course, that's because we're doing far more complicated renders - denser, with more sophisticated shading models. If we had tried to render that back in 1989, it would have taken a month to do what we're doing in an hour now. But people are always trying to figure out a way to make the process faster. Faster is always better.

JOHN DYKSTRA: It's still really tough to do random stuff in the computer. The computer doesn't like to do random stuff; it likes to do organized stuff. So that's a limitation - even when it's done really well. I was astonished at the work in Pearl Harbor. I thought the explosions, fire and smoke were phenomenal in that movie, some of the best-looking procedural stuff I've seen. But it is still really hard to make that kind of random stuff - water, smoke, clouds, fire. Those things are a huge challenge for digital imaging.

ALEX FUNKE: There's also no good digital lighting program. Software designers have never sat down with real cinematographers and said, "OK, let's see how you do this and let's design a program that actually works the way the cinematographer lights." We're way behind on that.

ERIK NASH: Given the complex nature of CG lighting and how non-intuitive and non-interactive it is, I don't know how the artists do what they do. Artists who really understand how digital lighting works do remarkable things with tools that don't mimic real-world lighting tools in any way, shape or form! It's liberating, in that you can do things you could only dream of doing in the physical world; but the non-interactivity of it, to me, would be very frustrating. That's one aspect of it that still seems incredibly arcane - how slow it is to see what it is you're really doing when you're lighting a CG object.

RICHARD HOLLANDER: That's what we need - a way to realistically light environments and get results that don't take 40,000 years to render. It seems to me that if I was to put a synthetic creature sitting next to you in this room, since the lighting is not changing for all of our shots, no matter where my camera is, there shouldn't have to be too much human intervention on the lighting of that creature. We are striving for that. We want to understand this room from a light-sourcing POV - and we want that to happen fast. That's the next layer - to light 90 percent semi-automatically, then tweak the rest.



CINEFEX: It seems like lighting, since it is physics-based, ought to be something that could be proceduralized fairly easily.

ERIK NASH: Yes, it can all be quantified. The problem is just the complexity of the calculations, the number of calculations necessary to mimic what happens in the real world. All of that is so staggering, to be able to run that simulation in anything approaching real-time is just currently beyond the realm of possibility. But, as computer power grows, interactivity will improve in leaps and bounds.
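Nash's point about the sheer number of calculations can be made concrete with a back-of-envelope sketch. All the figures below (resolution, sample counts, bounce depth, lights, evaluation rate) are illustrative assumptions of my own, not production numbers:

```python
# Back-of-envelope estimate of how many shading evaluations one
# physically based frame can require. Every number here is an
# illustrative assumption, not a production figure.

width, height = 2048, 1556      # a common 2K film frame resolution
samples_per_pixel = 64          # Monte Carlo samples to tame noise
bounces = 4                     # indirect-light bounces per sample
lights = 8                      # light sources evaluated per bounce

# Each sample evaluates every light at every bounce.
shading_evals = width * height * samples_per_pixel * bounces * lights
print(f"{shading_evals:,} shading evaluations per frame")

# At an assumed 10 million evaluations per second, that lands in the
# range Knoll describes - minutes to hours per frame, not real time.
seconds = shading_evals / 10_000_000
print(f"~{seconds / 60:.0f} minutes at 10M evals/sec")
```

Multiply that by a shot's frame count, and by every lighting iteration an artist wants to review, and the gap between "it can all be quantified" and "it runs interactively" becomes obvious.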

HENRY LaBOUNTA: The past couple of years at Siggraph, it seems like more than half the presentations are about real-time issues. The stuff Nvidia is doing with Gelato is brilliant. It's a whole new package they've come out with that is basically intended to do RenderMan-style rendering in real time.

PHIL TIPPETT: That would be ideal - to be able to go in and manipulate a thing with all of the lights on it, in real time. When I was doing stop-motion animation, I found that lighting had everything to do with how you moved the character. And when you're dealing with a wireframe thing, you don't have that. You're just dealing with movement in the abstract until you see the thing rendered out.

RANDY COOK: That's the biggest disadvantage in computer animation, versus stop-motion or some of the older techniques - you don't see what the light is doing as you are animating. On LOTR we tried to get our lighting setups as soon as we could and incorporate them into our work. We also suggested lighting positions, so that our character's face would 'find the light' at the right moment. But most animators today don't get to do that. They're not allowed access to that part of the toolbox, which is unfortunate.

ROB COLEMAN: I think we're going to have that real-time rendering in the next 3 to 5 years. It may not have every level of specular highlights, or subsurface scattering or all those beautiful things that they do to make realistic skin these days; but it would certainly be good enough for me to judge the animation at a level beyond what we have now.

---------------------------------------------------------------------------------------------------------------------------------------------------------------