lorinpoo

Just wanted to note that the "Portrait Mode" offered on iPhones is not a natural effect of the camera focusing on the closer object (as in the photo here, which was taken with a DSLR). It actually uses edge detection to computationally blur the background!
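
A minimal sketch of that final compositing step, assuming a subject mask has already been computed (the filenames and the precomputed `mask.png` are hypothetical; a real Portrait Mode pipeline derives the mask from segmentation and/or depth):

```python
import cv2
import numpy as np

# Hypothetical inputs: a photo and a subject mask (white = subject).
img = cv2.imread("photo.jpg").astype(np.float32)
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Blur the whole frame to fake a shallow depth of field.
blurred = cv2.GaussianBlur(img, (0, 0), sigmaX=15)

# Feather the mask edge so the subject/background transition is soft;
# hard mask edges are exactly where synthetic bokeh tends to look wrong.
mask = cv2.GaussianBlur(mask, (0, 0), sigmaX=5)[..., None]

# Composite: keep the subject sharp, use the blurred background elsewhere.
out = mask * img + (1.0 - mask) * blurred
cv2.imwrite("portrait.jpg", out.astype(np.uint8))
```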

sweyns

As a side note, there is a camera technique called deep focus that captures all parts of the image equally sharply. Given what we've seen, I wonder how such a large depth of field is achieved by a camera.

unicorn

Deep focus is usually achieved by using a smaller aperture (i.e. the light enters the lens through a smaller hole). The picture above was probably taken at something like f/2 to f/4 (just a rough guess based on experience; the blurred-background effect will vary with subject distances and the lens used). Keeping objects at widely varying distances all in focus requires something in the range of f/10 to f/20.
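
To make the f/2 vs. f/10 intuition concrete, here is a small depth-of-field calculator using the standard thin-lens approximations (hyperfocal distance, then near/far limits of acceptable sharpness); the 50 mm focal length and 0.03 mm circle of confusion are just example values:

```python
def dof_limits(f_mm, N, s_mm, c_mm=0.03):
    """Near/far limits of acceptable sharpness (thin-lens approximation).

    f_mm: focal length, N: f-number, s_mm: focus distance,
    c_mm: circle of confusion (0.03 mm is a common full-frame value).
    """
    H = f_mm ** 2 / (N * c_mm) + f_mm          # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
    return near, far

# A 50 mm lens focused at 2 m: wide vs. narrow aperture.
for N in (2, 10):
    near, far = dof_limits(f_mm=50, N=N, s_mm=2000)
    print(f"f/{N}: sharp from {near / 1000:.2f} m to {far / 1000:.2f} m")
```

With these example numbers, the sharp zone is roughly 1.91-2.10 m at f/2 but roughly 1.62-2.61 m at f/10, which matches the rule of thumb that stopping down buys you depth of field.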

Deep focus can also be achieved by compositing two images of the same scene taken at different apertures (or at different focus distances, as in focus stacking).
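
A toy version of that compositing idea, assuming two hypothetical frames where one is sharp up close and the other is sharp in the distance: keep whichever frame is locally sharper at each pixel, using smoothed Laplacian magnitude as the sharpness measure.

```python
import cv2
import numpy as np

# Hypothetical pair: one frame sharp up close, one sharp in the distance
# (e.g. shot at different focus settings or apertures).
a = cv2.imread("near_sharp.jpg").astype(np.float32)
b = cv2.imread("far_sharp.jpg").astype(np.float32)

def sharpness(img):
    # Local contrast via Laplacian magnitude, smoothed so the
    # per-pixel choice between frames doesn't flicker.
    gray = cv2.cvtColor(img.astype(np.uint8), cv2.COLOR_BGR2GRAY)
    lap = np.abs(cv2.Laplacian(gray, cv2.CV_32F))
    return cv2.GaussianBlur(lap, (0, 0), sigmaX=7)

# At each pixel, keep the frame that is locally sharper.
pick_a = (sharpness(a) > sharpness(b))[..., None]
out = np.where(pick_a, a, b)
cv2.imwrite("deep_focus.jpg", out.astype(np.uint8))
```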

ntm

Google Research has published their synthetic depth-of-field methods, in case anyone is interested! They combine person segmentation with dense depth mapping to power Portrait Mode on the Pixel.
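
Not their exact pipeline, but the core "depth map drives blur radius" idea can be sketched: precompute a few blur levels and blend between them per pixel according to a hypothetical normalized depth map (`depth.png`, with 0 = in-focus subject plane and 1 = farthest background):

```python
import cv2
import numpy as np

# Hypothetical inputs: an image plus a depth map normalized to [0, 1].
img = cv2.imread("photo.jpg").astype(np.float32)
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Precompute a few blur levels and blend between them per pixel,
# instead of running a different-size blur at every pixel.
sigmas = [0.0, 4.0, 8.0, 16.0]
levels = [img] + [cv2.GaussianBlur(img, (0, 0), s) for s in sigmas[1:]]

# Map depth to a fractional blur level and linearly interpolate.
t = depth * (len(levels) - 1)
lo = np.clip(t.astype(int), 0, len(levels) - 2)
frac = (t - lo)[..., None]
stack = np.stack(levels)                      # shape (L, H, W, 3)
h_idx, w_idx = np.indices(depth.shape)
out = (1 - frac) * stack[lo, h_idx, w_idx] + frac * stack[lo + 1, h_idx, w_idx]
cv2.imwrite("synthetic_dof.jpg", out.astype(np.uint8))
```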

prathikn

A cool article on the iPhone XR's ability to capture depth and replicate this bokeh effect with a single camera: https://blog.halide.cam/iphone-xr-a-deep-dive-into-depth-47d36ae69a81?gi=bc610a121418