Raw WebGL

Alain Galvan
10 min read · Apr 8, 2020

While it is possible to render 3D graphics using basic HTML and CSS or with the Canvas API, there’s an alternative Web API for rendering detailed graphics with the same level of visual fidelity as video games.

WebGL is an adaptation of the OpenGL ES 2.0 spec to JavaScript, and started off as a collaboration between the Khronos Group and Mozilla.

It’s been used to showcase procedural art pieces in places like ShaderToy and CodePen, 3D models such as those rendered with Marmoset Viewer, geovisualizations such as those in Uber’s deck.gl, and much more.

WebGL is currently supported on all major browsers, and has been as far back as 2012.

At its core, WebGL is a state machine that lets you as the developer tell it how and where it will draw triangles, points, and lines, so it’s your job as an engine developer to organize when and how the state of the application will change.

I’ve prepared a GitHub repo with everything you need to get started. We’ll walk through writing a WebGL Hello Triangle application in TypeScript (JavaScript with optional type checking).

Check out my other post on OpenGL for writing native C++ applications with nearly the same interface. It’s possible to export OpenGL apps as WebGL via WebAssembly. If you would like to try a more recent API, check out my post on WebGPU, an API that aligns more closely to modern graphics APIs such as Metal/Vulkan/DirectX 12.


First install Node.js, then type the following in your terminal.

# 🐑 Clone the repo
git clone https://github.com/alaingalvan/webgl-seed
# 💿 go inside the folder
cd webgl-seed
# 🔨 Start building the project
npm start

Refer to this blog post on designing web libraries and apps for more details on Node.js, packages, etc.

Project Layout

As your project becomes more complex, you’ll want to separate files and organize your application into something more akin to a game or renderer. Check out this post on game engine architecture and this one on real-time renderer architecture for more details.

├─ 📂 node_modules/   # 👶 Dependencies
│ ├─ 📁 gl-matrix # ➕ Linear Algebra
│ └─ 📁 ... # 🕚 Other Dependencies (TypeScript, Webpack, etc.)
├─ 📂 src/ # 🌟 Source Files
│ ├─ 📄 renderer.ts # 🔺 Triangle Renderer
│ └─ 📄 main.ts # 🏁 Application Main
├─ 📄 .gitignore # 👁️ Ignore certain files in git repo
├─ 📄 package.json # 📦 Node Package File
├─ 📄 license.md # ⚖️ Your License (Unlicense)
└─ 📄 readme.md # 📖 Read Me!


  • gl-matrix — A JavaScript library that allows users to write GLSL-like JavaScript code, with types for vectors, matrices, etc. While not used in this sample, it's incredibly useful for more advanced topics such as camera matrices.
  • TypeScript — JavaScript with types, makes it significantly easier to program web apps with instant autocomplete and type checking.
  • Webpack — A JavaScript compilation tool to build minified outputs and test our apps faster.
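As a taste of why gl-matrix matters later, here’s a hand-rolled sketch of the column-major perspective projection matrix that gl-matrix’s mat4.perspective computes for you (the helper name here is hypothetical — in practice you’d just call the library):

```typescript
// 🎥 Column-major 4x4 perspective projection matrix, matching what
// gl-matrix's mat4.perspective produces (hypothetical helper for illustration)
function perspective(fovY: number, aspect: number, near: number, far: number): Float32Array {
    const f = 1.0 / Math.tan(fovY / 2);
    const nf = 1 / (near - far);
    // Column-major layout, as expected by gl.uniformMatrix4fv
    return new Float32Array([
        f / aspect, 0, 0, 0,
        0, f, 0, 0,
        0, 0, (far + near) * nf, -1,
        0, 0, 2 * far * near * nf, 0
    ]);
}

// Usage: a 90° vertical field of view on a square canvas
const proj = perspective(Math.PI / 2, 1.0, 0.1, 100.0);
```

You’d upload the result with gl.uniformMatrix4fv(location, false, proj) once you have a uniform location for it.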


In this application we will need to do the following:

  1. Initialize the API — Create your HTMLCanvasElement, either directly on your webpage or dynamically. Then call .getContext('webgl') to get a handle to the gl state machine which will write directly to that canvas. Then set up any initial state for the gl state machine, such as enabling depth testing, your clear color, etc.
  2. Initialize Resources — Create your WebGLBuffers for your Vertex, Index data, your Vertex/Fragment WebGLShaders, your WebGLProgram.
  3. Render — Define your vertex layout, bind your WebGLBuffers and WebGLProgram to the state machine, set any uniform data for that draw call, and drawElements.
  4. Destroy — Destroy any WebGL handles that you’ve created once you’re done using them.

The following sections explain snippets that can be found in the GitHub repo, with certain parts omitted, and member variables (this.memberVariable) declared inline without the this. prefix so their type is easier to see and the examples here can work on their own.

Initialize API

The key to starting a WebGL application is to call the getContext function on an HTMLCanvasElement. You can then supply a specific version of WebGL ('webgl' or 'webgl2') along with an optional config object.

// 👋 Declare handles
let canvas: HTMLCanvasElement = document.getElementById('webgl') as HTMLCanvasElement;

// ⚪ Initialization
let gl: WebGLRenderingContext = canvas.getContext('webgl');
if (!gl) {
    // This rendering engine failed to start...
    throw new Error('WebGL failed to initialize.');
}

// Most WebGL Apps will want to enable these settings:

// ⚫ Set the default clear color when calling `gl.clear`
gl.clearColor(0.0, 0.0, 0.0, 0.0);
// 🎭 Write to all channels during a clear
gl.colorMask(true, true, true, true);
// 👓 Test if when something is drawn, it's in front of what was drawn previously
gl.enable(gl.DEPTH_TEST);
// ≤ Use this function to test depth values
gl.depthFunc(gl.LEQUAL);
// 🌒 Hide triangles whose normals don't face the camera
gl.enable(gl.CULL_FACE);
// 🍥 Properly blend images with alpha channels
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
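The optional config object mentioned above is a standard WebGLContextAttributes dictionary; as a sketch, you might request a context like this (the values shown are examples, not requirements):

```typescript
// ⚙️ Optional context attributes passed as the second argument to getContext
const contextAttributes: WebGLContextAttributes = {
    alpha: false,                // Canvas backbuffer has no alpha channel
    depth: true,                 // Request a depth buffer
    stencil: false,              // Skip the stencil buffer
    antialias: true,             // Request MSAA if available
    premultipliedAlpha: true,    // How the canvas is composited with the page
    preserveDrawingBuffer: false // Backbuffer may be cleared between frames
};

// Usage (assuming `canvas` is your HTMLCanvasElement):
// let gl = canvas.getContext('webgl', contextAttributes);
```

Note these are hints: the browser may ignore some of them (antialias in particular), so query gl.getContextAttributes() if you need to know what you actually got.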

Initialize Resources

Vertex Buffer Object

A Vertex Buffer Object (VBO) is a block of memory (such as a Typed Array) containing vertex data.

You could describe this data with one big buffer containing everything or with independent arrays for each element in your vertex layout, whichever best fits your use case and performance requirements.

Having them split can be easier to update if you’re changing your vertex buffer data often, which may be useful for CPU animations or procedurally generated geometry.
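As a sketch of that dynamic case (helper names hypothetical): keep the CPU-side typed array around, rewrite it each frame, and push it into an already-allocated buffer with gl.bufferSubData instead of reallocating with gl.bufferData:

```typescript
// 🌊 Generate animated vertex positions on the CPU (hypothetical helper)
function wavePositions(time: number): Float32Array {
    const positions = new Float32Array([
        1.0, -1.0, 0.0,
        -1.0, -1.0, 0.0,
        0.0, 1.0, 0.0
    ]);
    // Bob each vertex's y coordinate up and down over time
    for (let i = 1; i < positions.length; i += 3) {
        positions[i] += 0.1 * Math.sin(time + i);
    }
    return positions;
}

// Each frame, overwrite the existing buffer storage in place
function updatePositionBuffer(gl: WebGLRenderingContext, buf: WebGLBuffer, time: number) {
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    // ♻️ bufferSubData avoids reallocating; pair it with gl.DYNAMIC_DRAW
    // as the usage hint when the buffer is first created
    gl.bufferSubData(gl.ARRAY_BUFFER, 0, wavePositions(time));
}
```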

// 📈 Position Vertex Buffer Data
const positions = new Float32Array([
    1.0, -1.0, 0.0,
    -1.0, -1.0, 0.0,
    0.0, 1.0, 0.0
]);

// 🎨 Color Vertex Buffer Data
const colors = new Float32Array([
    1.0, 0.0, 0.0, // 🔴
    0.0, 1.0, 0.0, // 🟢
    0.0, 0.0, 1.0 // 🔵
]);

// ✋ Declare Vertex Buffer Handles
let positionBuffer: WebGLBuffer = null;
let colorBuffer: WebGLBuffer = null;

// 👋 Helper function for creating WebGLBuffer(s) out of Typed Arrays
let createBuffer = (arr) => {
    // ⚪ Create Buffer
    let buf = gl.createBuffer();
    let bufType =
        arr instanceof Uint16Array || arr instanceof Uint32Array
            ? gl.ELEMENT_ARRAY_BUFFER
            : gl.ARRAY_BUFFER;
    // 🩹 Bind Buffer to WebGLState
    gl.bindBuffer(bufType, buf);
    // 💾 Push data to VBO
    gl.bufferData(bufType, arr, gl.STATIC_DRAW);
    return buf;
};

// ⚪ Create VBOs
positionBuffer = createBuffer(positions);
colorBuffer = createBuffer(colors);

Index Buffer Object

An Index Buffer Object (IBO) is a list of vertex indices that’s used to make triangles, lines, or points. When rendering a set of triangles, Index Buffers allow for the reuse of a given vertex for a different triangle.

If you’re rendering triangles, there should be 3 indices per triangle in the index buffer; for lines, 2 indices per line; and for points, 1 index per point.

// 🗄️ Index Buffer Data
const indices = new Uint16Array([ 0, 1, 2 ]);
// ✋ Declare Index Buffer Handle
let indexBuffer: WebGLBuffer = null;
// ⚪ Create IBO (☝ Refer to helper function above)
indexBuffer = createBuffer(indices);
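A single triangle doesn’t show off vertex reuse, so as a sketch beyond this sample: a quad can share its 4 corner vertices across 2 triangles, needing only 6 indices instead of 6 full vertices:

```typescript
// 🟥 4 unique corner positions for a quad
const quadPositions = new Float32Array([
    -1.0, -1.0, 0.0, // 0: bottom left
    1.0, -1.0, 0.0,  // 1: bottom right
    1.0, 1.0, 0.0,   // 2: top right
    -1.0, 1.0, 0.0   // 3: top left
]);

// 🗄️ 2 triangles × 3 indices, reusing corners 0 and 2 for both triangles
const quadIndices = new Uint16Array([
    0, 1, 2, // first triangle
    2, 3, 0  // second triangle
]);

// Drawn with gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0)
```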

Note the use of a TypedArray for the variable indices; WebGL expects typed arrays for data buffers. We're opting to use 16-bit unsigned integers for our indices, which is faster than using 32-bit integers; however, this limits how many vertices you can reference to the maximum value of a 16-bit unsigned int (65,535).
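If a mesh outgrows that limit, WebGL 1 can use 32-bit indices, but only after enabling the OES_element_index_uint extension. A hedged sketch of picking the index type from the vertex count (helper name hypothetical):

```typescript
// 🔢 Pick the smallest index type that can address every vertex
// (65535 is the maximum value of a 16-bit unsigned int)
function makeIndexArray(indices: number[], vertexCount: number): Uint16Array | Uint32Array {
    if (vertexCount <= 65536) {
        return new Uint16Array(indices);
    }
    // ⚠️ Drawing with gl.UNSIGNED_INT indices in WebGL 1 requires checking
    // gl.getExtension('OES_element_index_uint') returned non-null first
    return new Uint32Array(indices);
}
```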

Vertex Shader

A Vertex Shader is a GPU program that executes on every vertex of what you’re currently drawing. Oftentimes developers place code that handles positioning geometry here.

attribute vec3 inPosition;
attribute vec3 inColor;

varying vec3 vColor;

void main() {
    vColor = inColor;
    gl_Position = vec4(inPosition, 1.0);
}

And do the following to create a vertex shader in JavaScript:

// ✋ Declare Vertex Shader Handle
let vertModule: WebGLShader = null;

// 🕸️ Vertex Shader Source
const vertShaderCode = `
attribute vec3 inPosition;
attribute vec3 inColor;

varying vec3 vColor;

void main() {
    vColor = inColor;
    gl_Position = vec4(inPosition, 1.0);
}
`;

// 👋 Helper function for creating WebGLShader(s) out of strings
let createShader = (source: string, stage) => {
    // ⚪ Create Shader
    let s = gl.createShader(stage);
    // 📰 Pass Shader String
    gl.shaderSource(s, source);
    // 🔨 Compile Shader
    gl.compileShader(s);
    // ❔ Check if shader compiled correctly
    if (!gl.getShaderParameter(s, gl.COMPILE_STATUS)) {
        console.error('An error occurred compiling the shader: ' + gl.getShaderInfoLog(s));
    }
    return s;
};

vertModule = createShader(vertShaderCode, gl.VERTEX_SHADER);

Fragment Shader

A Fragment Shader executes on every fragment. A fragment is like a pixel, but not limited to 8-bit RGB: there could be multiple attachments that you’re writing to, in multiple encoded formats.

precision mediump float;

varying highp vec3 vColor;

void main() {
    gl_FragColor = vec4(vColor, 1.0);
}

And do the following to create a fragment shader in JavaScript:

// ✋ Declare Fragment Shader Handle
let fragModule: WebGLShader = null;

// 🟦 Fragment Shader Source
const fragShaderCode = `
precision mediump float;

varying highp vec3 vColor;

void main() {
    gl_FragColor = vec4(vColor, 1.0);
}
`;

// ☝️ Refer to the createShader helper in the vertex sample above
fragModule = createShader(fragShaderCode, gl.FRAGMENT_SHADER);

For more information on shader languages, check out this post comparing all shader languages across graphics APIs.


Shader Program

A Shader Program binds the vertex and fragment shaders together and sets them up to be used in the WebGL state machine.

// ✋ Declare Program Handle
let program: WebGLProgram = null;

// 👋 Helper function for creating WebGLProgram(s) out of WebGLShader(s)
let createProgram = (stages: WebGLShader[]) => {
    let p = gl.createProgram();
    for (let stage of stages) {
        gl.attachShader(p, stage);
    }
    // 🔗 Link the attached stages together
    gl.linkProgram(p);
    return p;
};

program = createProgram([vertModule, fragModule]);


Uniforms

Uniforms are variables that you send to your shader program to adjust its output.

All you need to do is declare a uniform in your shader:

uniform vec4 uColor;

And do the following to send your requested data to your shader:

// 👔 Uniform Data
let color = new Float32Array([0.5, 0.5, 0.5, 1.0]);
// 🔎 Get Uniform Location
let uniformLocation = gl.getUniformLocation(program, 'uColor');
// 🟢 Assign Data
gl.uniform4fv(uniformLocation, color);


Textures

Textures are image data structures that you can use as inputs to a shader or as frame buffer attachments.

// 🖼️ Load Textures
function loadTexture(url: string) {
    let tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // ⬜ Upload a single white pixel as a placeholder while the image loads
    const pixel = new Uint8Array([ 255, 255, 255, 255 ]);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
    let img = new Image();
    img.src = url;
    img.onload = () => {
        gl.bindTexture(gl.TEXTURE_2D, tex);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    };
    return tex;
}

And once they’re created, you can access them in your shader like so:

uniform sampler2D tAlbedo;

And do the following to send it to your shader:

// ✋ Declare Uniform Location Handle
let fragAlbedoUniform: WebGLUniformLocation = null;

// 🔎 Get Uniform Location
fragAlbedoUniform = gl.getUniformLocation(program, 'tAlbedo');

// 🌳 Set the current active Texture ID
gl.activeTexture(gl.TEXTURE0);
// 🟢 Bind the texture object to that Texture ID
gl.bindTexture(gl.TEXTURE_2D, tAlbedo);
// Set a uniform int as the current texture ID being referenced
gl.uniform1i(fragAlbedoUniform, 0);


Draw

To draw, call either gl.drawArrays to draw based on a specified pattern of how your vertex buffer is organized, or gl.drawElements if you're using an index buffer. More often than not you'll be using gl.drawElements.

// ✋ Declare animation handler
let animationHandler: number = 0;

// 🔺 Render triangle
const render = () => {
    // 🖌️ Encode drawing commands
    gl.viewport(0, 0, canvas.width, canvas.height);
    gl.scissor(0, 0, canvas.width, canvas.height);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    // 🔣 Bind Vertex Layout
    let setVertexBuffer = (buf: WebGLBuffer, name: string) => {
        gl.bindBuffer(gl.ARRAY_BUFFER, buf);
        let loc = gl.getAttribLocation(program, name);
        gl.vertexAttribPointer(loc, 3, gl.FLOAT, false, 4 * 3, 0);
        gl.enableVertexAttribArray(loc);
    };
    setVertexBuffer(positionBuffer, 'inPosition');
    setVertexBuffer(colorBuffer, 'inColor');

    // 🩹 Bind the Program and Index Buffer, then draw
    gl.useProgram(program);
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
    gl.drawElements(gl.TRIANGLES, 3, gl.UNSIGNED_SHORT, 0);

    // ➿ Refresh canvas
    animationHandler = requestAnimationFrame(render);
};

// 🏎️ Start the Rendering Engine
render();

Destroy

Even though the JavaScript runtime features garbage collection, GL objects will not be garbage collected over the course of an application’s lifetime (as long as you’re on the web page). If you want to get rid of a buffer, texture, frame buffer, etc., you’ll need to call gl.delete...(handle), where the ... could be a Buffer, Framebuffer, Program, Renderbuffer, Shader, or Texture.

// 💥 Destroy Buffers, Shaders, Programs
function destroyResources() {
    gl.deleteBuffer(positionBuffer);
    gl.deleteBuffer(colorBuffer);
    gl.deleteBuffer(indexBuffer);
    gl.deleteShader(vertModule);
    gl.deleteShader(fragModule);
    gl.deleteProgram(program);
}


Conclusion

While a bit different from other Web APIs, WebGL’s interface can be surprisingly intuitive, and everything you do in it can translate to other languages such as C++, C, Rust, etc. in the form of OpenGL.

There were a few things I didn’t cover in this post, as they would have been beyond its scope, such as:

  • Frame Buffers
  • Cube Maps
  • glDrawArrays options
  • Blend Modes
  • Matrices

Not to mention aspects of software engineering such as project organization, WebAssembly implementations, game engine architecture, real time renderer architecture, etc.

Additional Resources

You’ll find all the source code described in this post in the GitHub repo here.



Alain Galvan

https://Alain.xyz | Graphics Software Engineer @ AMD, Previously @ Marmoset.co. Guest lecturer talking about 🛆 Computer Graphics, ✍ tech author.