Computational photography is the idea that an image can be composed in a computer rather than inside the camera. Traditionally you take a picture and the image is formed on the film at the back of the camera; alternatively, a CCD sensor captures it instead of film. Either way, a single moment in time is captured and later displayed and viewed, on film or on a computer. But what if you capture more than the moment?
The movie 'The Matrix' famously explored aspects of this using an array of cameras. Each camera was set to take an image from a slightly different position and angle, and the cameras could be timed to fire simultaneously or at slightly staggered moments. The shots were later composited in a computer to create the illusion of a camera moving through a frozen scene.
What if you took three pictures of a room or a place? In principle you could compute a 3D model of that place, then move through the model to find just the right angle. Given enough computing power, a director could move a virtual camera anywhere within the shot in real time. This would be the ultimate instant replay.
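The core of building a 3D model from photographs is triangulation: if the same scene point is visible in two (or more) pictures taken from known camera positions, its 3D location can be solved for. Here is a minimal sketch of linear triangulation with NumPy; the camera matrices and the test point are made-up toy values, not from any real shoot, and a real pipeline would first have to estimate the camera poses and match points between images.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its pixel coordinates x1, x2 in two
    cameras with known 3x4 projection matrices P1, P2 (linear DLT)."""
    # Each observed coordinate gives one linear constraint A @ X = 0
    # on the homogeneous 3D point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The null space of A (smallest singular vector) is the solution.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Two toy cameras: one at the origin, one shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

point = np.array([0.5, 0.2, 4.0])        # "true" scene point
# Project the point into each camera to get its pixel coordinates.
h1 = P1 @ np.append(point, 1.0)
x1 = h1[:2] / h1[2]
h2 = P2 @ np.append(point, 1.0)
x2 = h2[:2] / h2[2]

recovered = triangulate(P1, P2, x1, x2)  # recovers ~[0.5, 0.2, 4.0]
```

With three or more pictures you simply stack more constraint rows into A, and with many matched points you get the full cloud of 3D positions that a virtual camera could then fly through.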
How about a simpler example... Today a photographer adjusts the amount of light the camera lets in and the length of the exposure. Even with digital cameras this can result in over- or underexposed pictures. With computational photography you could capture not only the momentary image but the intensity of light across the entire exposure. This would let you adjust the lighting of the shot after you had already taken it.
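One way to picture this: instead of integrating light over one fixed shutter interval, the sensor records the incoming intensity in many short time slices, and any exposure is synthesized afterwards by summing a chosen window of slices. The sketch below uses random numbers as stand-in sensor data; the slice count, gains, and the `synthesize_exposure` helper are all illustrative assumptions, not a real camera API.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in sensor data: 100 time slices of per-pixel light intensity
# for a tiny 4-pixel "image".
slices = rng.uniform(0.0, 1.0, size=(100, 4))

def synthesize_exposure(slices, start, stop, gain=1.0):
    """Re-expose the shot after capture: integrate the light recorded
    in slices[start:stop] and scale it, clipping to the display range."""
    image = gain * slices[start:stop].sum(axis=0)
    return np.clip(image, 0.0, 255.0)

# The same capture yields a short, bright-gained exposure...
short = synthesize_exposure(slices, 0, 10, gain=8.0)
# ...or a long, full-length exposure, chosen after the fact.
long_ = synthesize_exposure(slices, 0, 100, gain=1.0)
```

The photographer's shutter-speed decision becomes a post-processing parameter: an over- or underexposed result is just a different window and gain away.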