Generate Depth and Normal Maps with Blender

December 26, 2021
Tutorial

Blender is a popular open-source 3D computer graphics tool widely used among the developer community, with over 14 million downloads in 2020 alone. Primarily aimed at the 3D modelling and animation industry, for applications such as games, advertisements and entertainment, it also gives researchers working on 3D reconstruction a powerful tool for generating ground-truth 3D data.

3D reconstruction is an active research field in which we try to recover the 3D shape of an object or a scene from a 2D image or a sequence of images, such as video frames. There are many ways to represent the 3D shape of an object in a computer: the mesh vertices of the object, a depth map labelling each pixel with its distance from the camera, a normal map providing a surface normal vector at each pixel, and so on. In this tutorial, we use Blender 2.93.6 to render images of a 3D surface and save the corresponding depth maps and normal maps as numpy arrays using the Blender Python API.

Without further ado, let’s begin.

Create an empty project in Blender (File > New > General). It should give you something like this in the Layout tab.

Layout tab in Blender showing the 3D scene layout.

The default scene already has a cube, a camera and a light source added to it. In the bottom pane on the right-hand side, find the View Layer Properties and make sure that both the Combined and Z data passes are enabled.
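
If you prefer to script this step, the same passes can be enabled through the Python API. The snippet below is a minimal sketch that assumes you are working with the scene's active view layer:

import bpy

# Enable the Combined and Z (depth) passes on the active view layer
view_layer = bpy.context.view_layer
view_layer.use_pass_combined = True
view_layer.use_pass_z = True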

Next, go to the Compositing tab and enable nodes by checking the Use Nodes box. The compositor should show a Render Layers node and a Composite node at this point. Add a Viewer node and a Normalize node, and connect everything as follows:

Compositing tab in Blender showing node connections.
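
The same node setup can also be created programmatically instead of wiring the nodes by hand. The following sketch assumes the default node names ("Render Layers" and "Composite") from a fresh compositor:

import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
links = tree.links

# Reuse the default Render Layers and Composite nodes
render_layers = tree.nodes["Render Layers"]
composite = tree.nodes["Composite"]

# Add Normalize and Viewer nodes
normalize = tree.nodes.new(type="CompositorNodeNormalize")
viewer = tree.nodes.new(type="CompositorNodeViewer")

# Rendered image goes to the Composite node, normalised depth to the Viewer node
links.new(render_layers.outputs["Image"], composite.inputs["Image"])
links.new(render_layers.outputs["Depth"], normalize.inputs[0])
links.new(normalize.outputs[0], viewer.inputs["Image"])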

The Viewer node now shows the depth map of the camera view, with depth values normalised between 0 and 1.

Using the Blender Python API, we can access the raw depth values from the Viewer node with the following code:

def get_depth():
    """Obtains depth map from Blender render.
    :return: The depth map of the rendered camera view as a numpy array of size (H,W).
    """
    z = bpy.data.images['Viewer Node']
    w, h = z.size
    dmap = np.array(z.pixels[:], dtype=np.float32)  # convert to numpy array
    dmap = np.reshape(dmap, (h, w, 4))[:, :, 0]     # RGBA pixels -> single depth channel
    dmap = np.rot90(dmap, k=2)
    dmap = np.fliplr(dmap)
    return dmap

The rotation and horizontal flip are required to align the array values with the rendered image.

Surface normals can be computed from the depth values by taking the gradient of the depth map, here using OpenCV's Sobel filter:

def dmap2norm(dmap):
    """Computes surface normals from a depth map.
    :param dmap: A grayscale depth map image as a numpy array of size (H,W).
    :return: The corresponding surface normals map as numpy array of size (H,W,3).
    """
    zx = cv2.Sobel(dmap, cv2.CV_64F, 1, 0, ksize=5)  # gradient along x
    zy = cv2.Sobel(dmap, cv2.CV_64F, 0, 1, ksize=5)  # gradient along y

    # convert to unit vectors
    normals = np.dstack((-zx, -zy, np.ones_like(dmap)))
    length = np.linalg.norm(normals, axis=2, keepdims=True)
    normals /= length

    # offset and rescale values to be in 0-1
    normals += 1
    normals /= 2
    return normals[:, :, ::-1].astype(np.float32)

Alternatively, surface normals can also be generated directly by Blender itself. For this, you would need to enable the Normal data pass in the View Layer Properties, and then connect the Normal output of the Render Layers node to a Viewer node in the Compositor. Normal values can then be read similarly to the depth values in the get_depth() function above.
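
As a rough sketch of that alternative, assuming the Normal pass is enabled and the Viewer node has been re-wired to the Normal output, the per-pixel normals can be read with a get_normals() helper mirroring get_depth() (the helper name and wiring below are illustrative, not part of the Blender API):

# enable the Normal pass and connect it to the Viewer node first, e.g.:
# bpy.context.view_layer.use_pass_normal = True
# links.new(render_layers.outputs["Normal"], viewer.inputs["Image"])

def get_normals():
    """Obtains the surface normals map from the Viewer node.
    Assumes the Render Layers Normal output is connected to the Viewer node.
    :return: Normals as a numpy array of size (H,W,3).
    """
    img = bpy.data.images['Viewer Node']
    w, h = img.size
    nmap = np.array(img.pixels[:], dtype=np.float32)
    nmap = np.reshape(nmap, (h, w, 4))[:, :, :3]  # drop the alpha channel
    nmap = np.rot90(nmap, k=2)
    nmap = np.fliplr(nmap)
    return nmap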

In the Scripting tab, create an empty Python script and add the code snippets above. Everything can be combined like this:

import bpy
import cv2
import numpy as np

# ... definitions of get_depth() and dmap2norm() from above ...

bpy.context.scene.render.filepath = "rgb.png"
bpy.ops.render.render(animation=False, write_still=True)  # render and save the RGB image
dmap = get_depth()
nmap = dmap2norm(dmap)
np.savez_compressed("d.npz", dmap=dmap, nmap=nmap)

When you execute this script, an RGB image of the rendered camera view, together with the corresponding depth map and surface normals map as numpy arrays, will be saved to disk.
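
To use the saved maps later, for example in a separate evaluation script, the compressed archive can be loaded back with numpy (the keys match those passed to np.savez_compressed above):

import numpy as np

data = np.load("d.npz")
dmap = data["dmap"]  # depth map, shape (H, W)
nmap = data["nmap"]  # surface normals, shape (H, W, 3)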