Introduction to custom shaders in Three.js

by Mateo Martinjak, February 24, 2023
ThreeJS is an open-source JavaScript library used for creating interactive 3D graphics in a web browser. With ThreeJS, developers can easily create complex 3D scenes, add lighting and shading effects, apply textures and materials to objects, and add animation to their creations.


ThreeJS is built on top of WebGL, which is a low-level API for rendering 3D graphics in web browsers. However, ThreeJS abstracts away much of the complexity of working with WebGL, making it easier for developers to create 3D graphics without having to deal with the low-level details of WebGL programming.

WebGL: backbone of ThreeJS


WebGL is a JavaScript API that allows developers to create and manipulate 3D graphics in web browsers, with high-performance rendering in real time.


One of the key components of WebGL is the use of shaders. Shaders are small programs that run on the GPU and are responsible for calculating the appearance of 3D objects in a scene. There are two main types of shaders in WebGL: vertex shaders and fragment shaders. Shaders run in parallel: copies of the vertex shader run in parallel, one per vertex of the object, and then copies of the fragment shader run in parallel, one per fragment of the object.


(Figure: the JavaScript app sends vertex data to the vertex shader, which passes the transformed data on to the fragment shader, where the final color is calculated.)
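As a purely CPU-side analogy of this pipeline (the function names here are illustrative, not Three.js APIs), the two stages can be sketched like this:

```javascript
// CPU-side analogy of the GPU pipeline: the "vertex stage" maps every
// vertex independently, then the "fragment stage" maps every fragment.
// On a real GPU these mappings run in parallel.

// Vertex stage: transform each vertex (here, a simple translation).
function vertexStage(vertices, offset) {
  return vertices.map(([x, y, z]) => [x + offset[0], y + offset[1], z + offset[2]]);
}

// Fragment stage: compute a color for each fragment (here, based on height).
function fragmentStage(fragments) {
  return fragments.map(([x, y, z]) => (z > 0 ? 'light' : 'dark'));
}

const vertices = [[0, 0, -1], [1, 0, 1]];
const transformed = vertexStage(vertices, [0, 0, 1]); // [[0, 0, 0], [1, 0, 2]]
const colors = fragmentStage(transformed);            // ['dark', 'light']
```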


Vertex shader

Vertex shaders are responsible for transforming the vertices of 3D objects in a scene into their final positions on the screen. They do this by applying a series of mathematical transformations to the vertices, such as scaling, rotating, and translating. The output of the vertex shader is a set of transformed vertices that are ready for further processing.

  • Input: points in the scene, received from JS application

  • Output: transformed points, propagated to fragment shader


Fragment shader

Fragment shaders, on the other hand, are responsible for determining the color and other visual properties of each pixel in the final rendered image. They take as input the transformed vertices generated by the vertex shader and calculate the color and other properties of each pixel based on various lighting and shading effects. Fragment shaders can also apply textures and other materials to objects, allowing for highly detailed and realistic 3D graphics.

  • Input: points in the scene, received from vertex shader

  • Output: final computed color for that point


Together, the vertex and fragment shaders produce the final rendered image of a 3D scene. Because shaders run in parallel on the GPU, WebGL can perform these complex calculations fast enough for real-time rendering of 3D graphics.


Types of shader variables

Most variables inside shaders are of type float or vec3, and their use is fairly intuitive.


But it’s important to note two variable keywords, uniform and varying.


uniform defines a variable that has the same value across all shader invocations; it is set from the JS app itself. For example, if we want to pass a time value or a camera position to a shader, we pass it as a uniform.


Communication between the vertex and fragment shader happens in the background. But if you want to send additional information from one shader to the other, you can use varying variables: for example, to change a vertex's color depending on its coordinates in the scene.
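As a sketch of both keywords together (these shader strings are illustrative, not a complete material): a uniform is supplied from JavaScript and visible in both shaders, while a varying is written by the vertex shader and read, interpolated, by the fragment shader:

```javascript
// Illustrative sketch: uTime is a uniform supplied from JavaScript;
// vHeight is a varying written by the vertex shader and read by the
// fragment shader. Not a full working material, just the declarations.
const uniforms = {
  uTime: { value: 0.0 }, // updated from the JS app each frame
};

const vertexShader = `
  uniform float uTime;     // same value for every vertex
  varying float vHeight;   // passed on to the fragment shader

  void main() {
    vHeight = position.z;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;

const fragmentShader = `
  uniform float uTime;     // uniforms are visible in both shaders
  varying float vHeight;   // interpolated per fragment

  void main() {
    gl_FragColor = vec4(vec3(vHeight), 1.0);
  }
`;
```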

Creating a React App

  • Create an app: yarn create react-app threejs-app

  • Add three.js: yarn add three

  • Add react-three-fiber, a React renderer for three.js: yarn add @react-three/fiber

  • Add @react-three/drei, which provides helpers such as OrbitControls: yarn add @react-three/drei

Defining a Plane component


Create a Plane component that returns a mesh. For now we use a basic material, meshStandardMaterial, with a uniform color. Later we will replace it with a shader-generated material.

import { useRef } from 'react';
import * as THREE from 'three';

type PlaneProps = {
  position: [number, number, number],
  rotation?: [number, number, number],
  scale: number,
  subdivisions: number,
  material?: THREE.Material,
}

function Plane(props: PlaneProps) {
  const ref: any = useRef();

  return (
    <mesh
      {...props}
      ref={ref}>
      <planeGeometry args={[props.scale, props.scale, props.subdivisions, props.subdivisions]} />
      <meshStandardMaterial color='orange' />
    </mesh>
  )
}


This part is intuitive: a Plane should have position and rotation defined so the mesh can be placed and oriented in the scene. We also hold a ref to the mesh for later use.

Creating a main App component

import { Canvas } from '@react-three/fiber';
import { OrbitControls } from '@react-three/drei';

function App() {
  return (
    <div className="App">
      <Canvas>
        <OrbitControls />
        <ambientLight />
        <Plane 
          position={[0, 0, 0]} 
          rotation={[-Math.PI/2, 0, 0]}  
          scale={4}
          subdivisions={32}
        />
      </Canvas>
    </div>
  );
}


Here it is important to note that we use OrbitControls for basic mouse navigation. Also, since a Plane by default faces the camera (it lies in the xy-plane), we rotate it by 90 degrees around the x-axis to lay it flat.

Defining shaders


Shaders in the app are defined as string variables. A minimal vertex shader implementation looks like this:

const vertexShader = () => {
  return `
    void main() {
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }`
}
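To make the vec4 and the matrix multiplication less mysterious, here is a CPU-side sketch of what such a multiplication computes (plain JavaScript, purely illustrative; GLSL itself uses column-major matrices, the row-major form below is just easier to read):

```javascript
// Illustrative sketch of `matrix * vec4(position, 1.0)`: a 4x4 matrix
// applied to a point in homogeneous coordinates. The 4th (w) component
// is what lets a 4x4 matrix encode a translation.

// Multiply a 4x4 matrix (row-major nested arrays) by a 4-vector.
function applyMatrix(m, v) {
  return m.map(row => row.reduce((sum, a, i) => sum + a * v[i], 0));
}

// A translation by (2, 3, 4): only works because w = 1.
const translate = [
  [1, 0, 0, 2],
  [0, 1, 0, 3],
  [0, 0, 1, 4],
  [0, 0, 0, 1],
];

const point = [1, 1, 1, 1];                  // vec4(position, 1.0)
const moved = applyMatrix(translate, point); // [3, 4, 5, 1]
```

A plain 3×3 matrix cannot represent a translation at all; the extra w = 1 component is what makes it possible.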


Exactly what projectionMatrix and modelViewMatrix do is outside the scope of this article. They convert the object from its 'local' coordinates into the 'global' coordinate system, since every object is defined in its local coordinates for simplicity. The 4th component of gl_Position is there to make multiplication with 4×4 matrices work; we won't use it for anything else here. A minimal implementation for a fragment shader is:

  const fragmentShader = () => {
    return `
      void main() {
        gl_FragColor = vec4(0.5,0.5,0.5,1.0);
      }`
  }


It just returns the same color for each fragment, for now. Again, the 4th component is left at 1.0. That's it. To create the final non-standard material with these shaders, we use ShaderMaterial:

const myShader = new THREE.ShaderMaterial( {
  uniforms: {},
  fragmentShader: fragmentShader(),
  vertexShader: vertexShader()
} );


But for it to work, we need to change the code a little. App now sends a new prop, material, to the Plane:

<Plane 
  position={[0, 0, 0]} 
  rotation={[-Math.PI/2, 0, 0]}  
  material={myShader} 
/>


And Plane no longer uses meshStandardMaterial; instead, it uses the shader-generated material passed in through props:

// Remove line <meshStandardMaterial color='orange' />
<primitive 
  object={props.material} 
  attach="material" 
/> 
 


Nothing has changed visually, since the shaders don't do anything interesting yet. Let's change that.


First, let's introduce a couple of uniform variables:

var uniforms = {
  colorDarkBlue: { type: 'vec3', value: new THREE.Color( 0x0000FF ) },
  colorLightBlue: { type: 'vec3', value: new THREE.Color( 0xFFFFFF ) },
  uTime: { type: 'f', value: 0.0 },   
};


const myShader = new THREE.ShaderMaterial( {
  uniforms: uniforms,
  fragmentShader: fragmentShader(),
  vertexShader: vertexShader()
} );


Since we use uTime to give the shaders a reference of time, we need to update this variable on every frame. So in the Plane component, we use the useFrame hook to add a small delta to uTime, through the ref we defined for that Plane:

  useFrame((state, delta) => {
    if (ref.current.material.uniforms) {
      ref.current.material.uniforms.uTime.value += delta;
    }
  })


We are ready to create final shaders:

  const vertexShader = () => {
    return `
      uniform float uTime;
      varying vec3 vPosition;

      void main() {

        float wave = sin(2.0 * (uTime + position.x + position.y)) / 4.0;

        gl_Position = projectionMatrix * modelViewMatrix *
        vec4( position.x, position.y, position.z + wave, 1.0 );

        vPosition = vec3(position.x, position.y, position.z + wave);
      }
    `
  }


We are creating a wave animation on this Plane by varying the position along the z axis; the uTime uniform is what drives the animation. Additionally, the fragment shader will need the newly displaced points, so we send it these coordinates through varying vec3 vPosition.
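The displacement formula itself can be reasoned about outside GLSL; here is the same function in plain JavaScript (illustrative only):

```javascript
// The same wave formula as in the vertex shader: each vertex gets a
// z-offset that depends on its position and on time, so the surface
// ripples as uTime grows.
function wave(uTime, x, y) {
  return Math.sin(2.0 * (uTime + x + y)) / 4.0;
}

wave(0, 0, 0);           // 0: the origin is flat at t = 0
wave(0.5, 0, 0);         // the same vertex a moment later, now displaced
Math.abs(wave(3, 1, 2)); // never exceeds 0.25, the wave's amplitude
```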


Our fragment shader looks like this:

  const fragmentShader = () => {
    return `
      uniform vec3 colorDarkBlue;
      uniform vec3 colorLightBlue;
      uniform float uTime;
      varying vec3 vPosition;

      void main() {
        gl_FragColor = vec4(mix(colorDarkBlue, colorLightBlue, (vPosition.z + 0.5) / 2.0), 1.0);
      }`
  }


Since we want to show a slightly different color depending on the fragment's position in the z-direction, we use mix(), which gives us a color between colorDarkBlue and colorLightBlue depending on that position.
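GLSL's mix() is just linear interpolation; a plain-JavaScript version (illustrative, with RGB colors as arrays of channel values in [0, 1]) makes the blend explicit:

```javascript
// GLSL's mix(a, b, t) is linear interpolation: a * (1 - t) + b * t,
// applied per channel, just as the fragment shader does with vec3 colors.
function mix(a, b, t) {
  return a.map((channel, i) => channel * (1 - t) + b[i] * t);
}

const darkBlue = [0, 0, 1];
const white = [1, 1, 1];

mix(darkBlue, white, 0);   // [0, 0, 1]: entirely the first color
mix(darkBlue, white, 1);   // [1, 1, 1]: entirely the second color
mix(darkBlue, white, 0.5); // [0.5, 0.5, 1]: halfway between
```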

Making it more realistic


Since the animation still feels monotone, the best way to continue is to add a water texture. Textures are used with these kinds of materials to give them imperfections, and therefore a more realistic feel. A texture is just an image that we sample in the vertex shader to add small displacements to the animation, and additionally in the fragment shader to mix into the final color. It is also worth adding light interaction: we didn't use lights in these shaders, but light data can be passed in through uniform variables so that the fragment shader can render more realistic lighting on the model.


Here is an official Three.js example using such textures and advanced light interactions:

