Abstract

In this project, we show that the focus of a photograph can be changed after it is taken by using real optical physics rather than synthetic software blur. Modern phones approximate this with "portrait mode" effects, but those results are often not physically correct. Our approach uses multiple photos taken from slightly different positions to recover the directions from which light rays arrived in the scene.

To collect the photos, we translate an iPhone sideways on a slider in small, precise steps. Because we know exactly how far the phone moved between pictures, we can track how objects shift across the images and estimate the angle of each light ray. With this information, we rebuild a 4D light field and use geometric back-propagation in MATLAB to move the virtual focal plane forward or backward, letting us refocus on objects at different distances.
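The refocusing idea above can be illustrated with a shift-and-add sketch: each view is shifted in proportion to its known camera displacement and a refocus parameter, then the shifted views are averaged, so objects at the matching depth align and stay sharp while everything else blurs. This is a minimal illustration in Python/NumPy, not the authors' MATLAB implementation; the function name, the integer-pixel shifts, and the use of one disparity unit per position step are assumptions for clarity.

```python
import numpy as np

def refocus(images, positions, alpha):
    """Shift-and-add refocusing for a camera translated along one axis.

    images    : list of 2D arrays (grayscale frames), one per slider stop
    positions : known camera displacement for each frame, in units where
                one step of displacement produces one pixel of parallax
    alpha     : refocus parameter; varying it moves the virtual focal plane
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, x in zip(images, positions):
        # Disparity of the target depth plane in this view: larger |alpha|
        # focuses nearer (more parallax), alpha = 0 leaves views unshifted.
        shift = int(round(alpha * x))
        acc += np.roll(img, shift, axis=1)  # horizontal shift, then accumulate
    return acc / len(images)  # average the aligned views
```

A real pipeline would use sub-pixel interpolation instead of integer `np.roll` shifts, but the geometry is the same: sweeping `alpha` sweeps the focal plane through the scene.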

We tested our method both in simulation (using a teapot scene) and on real iPhone photos taken in the library and outdoors at night. In all cases, the algorithm was able to shift the focus to different depths in a physically meaningful way. The results show that accurate, physics-based refocusing is possible using smartphone hardware as long as the camera positions are known.

This system can be expanded in the future by taking photos in a 2D grid, using all three iPhone cameras, improving sampling, and making the process fast enough to use in an iOS demo app.

Document Type

Article

Author's School

McKelvey School of Engineering

Author's Department

Electrical and Systems Engineering

Class Name

Electrical and Systems Engineering Undergraduate Research

Language

English (en)

Date of Submission

12-7-2025
