This Unity project demonstrates light field rendering using a compute shader. For better rendering results, it is built on HDRP and uses the default scene from the HDRP template.
The demo is designed to work in editor mode.
- A `LightField` GameObject generates a light field dataset. It has two planes for ray parametrization, and the relative orientation and position of the two planes are maintained. To adjust the position of the whole `LightField`, drag the `ST` plane and the `UV` plane follows.
  - When enabled, it creates a camera array on the `UV` plane.
  - Clicking `DoRender` renders the dataset into a `Texture2DArray` (see the capture sketch below). `HasRenderResults` indicates that novel views can be generated.
- A `NovelViewSynthesis` GameObject is a simulated camera in the scene. It uses Unity's default plane mesh as the camera's image plane. A compute shader uses the pixel locations and the novel-view camera position to sample the light field dataset and writes the result to a render texture. This texture then becomes the main texture of the plane's material, accomplishing real-time novel view synthesis.
  - There are two ways to sample the dataset (U, V, S, T):
    - nearest
    - quadrilinear interpolation (see the sampling sketch below)
  - It listens to events from the `LightField` so that, ideally, the compute shader starts running as soon as render results are ready. If it does not, make sure the `LightField` has render results, re-enable this script, and check `DoRender` manually.
  - The plane can then be rotated/moved, and the camera center can be moved, to check novel views. `SaveRenderTexture` saves the novel view as images (see the save sketch below).
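
The camera-array capture step behind `DoRender` can be pictured roughly as below. This is a minimal sketch, not the project's actual code: the grid size, per-view resolution, camera placement, and field names are all assumptions.

```csharp
using UnityEngine;

// Minimal sketch of rendering a grid of cameras on the UV plane into a
// Texture2DArray. Grid size, resolution, camera placement, and field names
// are assumptions, not the project's actual code.
public class LightFieldCaptureSketch : MonoBehaviour
{
    public Transform uvPlane;      // plane holding the camera array
    public Transform stPlane;      // plane the cameras look toward
    public Camera captureCamera;   // a disabled camera used for off-screen renders
    public int gridSize = 8;       // cameras per side
    public int resolution = 256;   // per-view resolution

    public Texture2DArray dataset; // filled by DoRender; stands in for HasRenderResults

    public void DoRender()
    {
        dataset = new Texture2DArray(resolution, resolution, gridSize * gridSize,
                                     TextureFormat.RGBA32, false);
        var rt = new RenderTexture(resolution, resolution, 24);
        var readback = new Texture2D(resolution, resolution, TextureFormat.RGBA32, false);
        captureCamera.targetTexture = rt;

        for (int v = 0; v < gridSize; v++)
        for (int u = 0; u < gridSize; u++)
        {
            // Place the camera at grid point (u, v) on the UV plane
            // (Unity's default plane mesh spans 10x10 local units).
            var local = new Vector3(u / (gridSize - 1f) - 0.5f, 0f,
                                    v / (gridSize - 1f) - 0.5f) * 10f;
            captureCamera.transform.position = uvPlane.TransformPoint(local);
            captureCamera.transform.rotation =
                Quaternion.LookRotation(stPlane.position - captureCamera.transform.position);

            // Note: under HDRP a manual Camera.Render() call may need to be
            // replaced by a render request, depending on the Unity version.
            captureCamera.Render();

            // Read the view back and copy it into slice (v * gridSize + u).
            RenderTexture.active = rt;
            readback.ReadPixels(new Rect(0, 0, resolution, resolution), 0, 0);
            readback.Apply();
            dataset.SetPixels(readback.GetPixels(), v * gridSize + u);
            RenderTexture.active = null;
        }

        dataset.Apply();
        captureCamera.targetTexture = null;
        rt.Release();
    }
}
```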
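
Quadrilinear interpolation of a (U, V, S, T) ray combines bilinear interpolation across the four nearest cameras on the UV plane with bilinear interpolation inside each of those views, i.e. 16 texel reads per output pixel. The project does this inside the compute shader; the CPU-side sketch below only illustrates the weighting, and the dataset layout and names are assumptions.

```csharp
using UnityEngine;

// CPU-side sketch of quadrilinear light field sampling: bilinear over the
// camera grid (u, v) combined with bilinear inside each view (s, t).
// The slice layout (index = v * gridSize + u) is an assumption.
public static class LightFieldSamplingSketch
{
    // uv is a continuous coordinate in [0, gridSize-1]^2,
    // st a continuous pixel coordinate in [0, width-1] x [0, height-1].
    public static Color SampleQuadrilinear(Color[][] slices, int gridSize,
                                           int width, int height,
                                           Vector2 uv, Vector2 st)
    {
        int u0 = Mathf.FloorToInt(uv.x), v0 = Mathf.FloorToInt(uv.y);
        float fu = uv.x - u0, fv = uv.y - v0;

        Color result = Color.clear;
        for (int dv = 0; dv <= 1; dv++)
        for (int du = 0; du <= 1; du++)
        {
            int u = Mathf.Clamp(u0 + du, 0, gridSize - 1);
            int v = Mathf.Clamp(v0 + dv, 0, gridSize - 1);

            // Bilinear weight of this camera on the UV plane.
            float wUV = (du == 0 ? 1f - fu : fu) * (dv == 0 ? 1f - fv : fv);

            // Bilinear lookup of (s, t) inside the selected view.
            result += wUV * SampleBilinear(slices[v * gridSize + u], width, height, st);
        }
        return result;
    }

    static Color SampleBilinear(Color[] pixels, int width, int height, Vector2 st)
    {
        int s0 = Mathf.Clamp(Mathf.FloorToInt(st.x), 0, width - 2);
        int t0 = Mathf.Clamp(Mathf.FloorToInt(st.y), 0, height - 2);
        float fs = Mathf.Clamp01(st.x - s0), ft = Mathf.Clamp01(st.y - t0);

        Color c00 = pixels[t0 * width + s0];
        Color c10 = pixels[t0 * width + s0 + 1];
        Color c01 = pixels[(t0 + 1) * width + s0];
        Color c11 = pixels[(t0 + 1) * width + s0 + 1];

        return Color.Lerp(Color.Lerp(c00, c10, fs), Color.Lerp(c01, c11, fs), ft);
    }
}
```

Nearest sampling simply snaps both the (u, v) and (s, t) coordinates to the closest grid point and camera pixel, so it is cheaper but shows visible view-switching artifacts.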
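
Saving the synthesized render texture to disk typically follows Unity's `ReadPixels`/`EncodeToPNG` pattern. The sketch below shows that pattern; the class name and path handling are placeholders, not the project's actual `SaveRenderTexture` implementation.

```csharp
using System.IO;
using UnityEngine;

// Sketch of saving a RenderTexture (the synthesized novel view) as a PNG.
// The output path is supplied by the caller; no project-specific naming is assumed.
public static class RenderTextureSaverSketch
{
    public static void Save(RenderTexture rt, string path)
    {
        var previous = RenderTexture.active;
        RenderTexture.active = rt;

        // Read the GPU texture back into a CPU-side Texture2D.
        var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
        tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        tex.Apply();

        RenderTexture.active = previous;

        File.WriteAllBytes(path, ImageConversion.EncodeToPNG(tex));
        Object.DestroyImmediate(tex);
    }
}
```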
- Unity 2022.3.48f1
- Rendering pipeline: HDRP




