File: ascii-shader-ogl.md | Updated: 11/15/2025
I've started using shaders as a powerful mechanism for creating cool generative art and building performant animations. One cool thing you can do with shaders is use the output from one as the input to another, which can produce some really interesting effects.
In this article, I'll walk you through how to do exactly that.
By following along with the article, you'll set up an OGL project from scratch, write a Perlin noise shader, and feed its output into a second, ASCII-rendering shader.
By the end, you'll have built this awesome-looking shader:
The finished shader. A cool lava lamp-like effect that's generated using ASCII characters.
I've always liked ASCII art, and I think it stemmed from being a young gamer in the early 2000s. The fan-made walkthrough guides I used to read would often display the game's logo in ASCII art, and I always loved it. So this article is a love letter to the unsung ASCII art heroes from the turn of the millennium.

The "Kingdom Hearts" logo built using ASCII characters
Note: I'll be using OGL to render the shader. If you haven't used it before, it's a lightweight alternative to Three.js. It's not as feature-rich, but it can do a lot of cool shader and 3D work while being a fifth of the size.
It's worth having a little experience with shaders: understanding what they are, the difference between a vertex and a fragment shader, and so on. Since I'll be creating the project from scratch, it's recommended that you're comfortable using the terminal in your code editor of choice, and comfortable writing basic HTML, CSS, and JavaScript.
You can still follow along even if you haven't had any shader experience. I'll guide you step by step through creating the shader from scratch, focusing on building the project without diving too deeply into the fundamentals.
We'll create two shaders. Instead of rendering the first shader to an HTML canvas (which is the default behaviour), we'll store the rendered data in memory. Since it will be stored inside a variable, we can then pass it to the second shader, which will be able to sample that output as a texture.
The setup required for this is relatively straightforward, so we'll create the project from scratch.
Start by creating an empty directory and navigate into it. Run the following commands in your terminal:
npm init
npm i ogl resolve-lygia tweakpane vite
touch index.html
Open up your package.json file and update the scripts object:
"scripts": {
"dev": "vite"
}
Finally, kick off your dev server using npm run dev.
Before opening the browser, you'll need to paste the following into your index.html:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Document</title>
<style>
body {
margin: 0;
}
canvas {
display: block;
}
</style>
</head>
<body>
<script type="module" src="./main.mjs"></script>
</body>
</html>
The above just adds the bare minimum markup to get something working. Finally, create a new file main.mjs in the root directory and add a simple console.log("hello world").
Open the browser on the assigned port, open the console, and you should see "hello world".
Before playing around with noise and ASCII generators, let's write a simple shader. Doing so will lay the plumbing needed to add more complex shaders.
In the main.mjs file, import the following classes from OGL:
import {
Camera,
Mesh,
Plane,
Program,
Renderer,
} from "ogl";
The first thing we'll need to do is initialise the renderer. Doing so creates a default canvas element, which we then append to the page.
const renderer = new Renderer();
const gl = renderer.gl;
document.body.appendChild(gl.canvas);
The next step is to create a camera, which renders the scene as a human would see it. We need to pass through a couple of settings, like the boundaries of the view and the position of the camera.
const camera = new Camera(gl, { near: 0.1, far: 100 });
camera.position.set(0, 0, 3);
It's also worth explicitly setting the width and height of the canvas. We'll create a function that does this, invoke it once, and attach it to the resize event listener.
function resize() {
renderer.setSize(window.innerWidth, window.innerHeight);
camera.perspective({ aspect: gl.canvas.width / gl.canvas.height });
}
window.addEventListener("resize", resize);
resize();
Let's now create our very first shader. We'll use OGL's Program class, which is responsible for linking the vertex and fragment shaders. It's also responsible for initialising uniforms: values we can dynamically update and pass through to our shader code.
Finally, and most importantly, it's responsible for compiling our shader. If there's a build-time error in the code, it will display a warning in the console and fail to compile.
const program = new Program(gl, {
vertex: `#version 300 es
in vec2 uv;
in vec2 position;
out vec2 vUv;
void main() {
vUv = uv;
gl_Position = vec4(position, 0.f, 1.f);
}`,
fragment: `#version 300 es
precision mediump float;
uniform float uTime;
in vec2 vUv;
out vec4 fragColor;
void main() {
float hue = sin(uTime) * 0.5f + 0.5f;
vec3 color = vec3(hue, 0.0f, hue);
fragColor = vec4(color, 1.0f);
}
`,
uniforms: {
uTime: { value: 0 },
},
});
We're passing three options to our program: a vertex shader, a fragment shader, and the uniform values.
This shader won't render just yet, but if you look inside the fragment code, you may notice that the hue changes based on the current time. The hue value determines the amount of red and blue we're adding to the pixel, since the color variable sets the RGB value for the fragment.
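To see the remapping at work, here's the same math in plain JavaScript (hueFromTime is just an illustrative name, not part of the project):

```javascript
// sin() returns values in [-1, 1]; multiplying by 0.5 and adding 0.5
// remaps that range to [0, 1], which is what a color channel expects.
function hueFromTime(t) {
  return Math.sin(t) * 0.5 + 0.5;
}

console.log(hueFromTime(0));            // 0.5 (mid purple)
console.log(hueFromTime(Math.PI / 2));  // 1   (full purple)
console.log(hueFromTime(-Math.PI / 2)); // 0   (black)
```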
Next, we need to create a Plane geometry. This is going to be a 2D rectangle that covers the screen. To do this, we just need to pass through the following options:
const geometry = new Plane(gl, {
width: 2,
height: 2,
});
Now we need to combine the shader program and the geometry. This is achieved using OGL's Mesh class. Creating an instance of a mesh gives us a model that we can render to the screen.
const mesh = new Mesh(gl, { geometry, program });
Now that we have everything we need to render our shader, we need to create a render loop, which runs the shader code on each frame. We also need to track the elapsed time and update the uniform value; without it, the sin function would return the same value on every frame.
function update(t) {
requestAnimationFrame(update);
const elapsedTime = t * 0.001;
program.uniforms.uTime.value = elapsedTime;
renderer.render({ scene: mesh, camera });
}
requestAnimationFrame(update);
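One detail worth calling out: requestAnimationFrame hands the callback a timestamp in milliseconds, which is why we multiply by 0.001. A tiny sketch of that conversion (toElapsedSeconds is an illustrative name):

```javascript
// The rAF timestamp is in milliseconds; multiplying by 0.001 converts
// it to seconds, so uTime advances by roughly 1.0 per second
// regardless of frame rate.
function toElapsedSeconds(timestampMs) {
  return timestampMs * 0.001;
}

console.log(toElapsedSeconds(5000)); // 5 seconds in
console.log(toElapsedSeconds(16.7)); // ~0.0167: one 60fps frame
```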
If you open up your browser, you should see a shader that fluctuates between purple and black.
A simple shader showing the browser slowly oscillating between purple and black
If your code isn't rendering at all, or not as expected, go through the instructions a couple more times. OGL is also good at displaying compilation errors in the browser's dev console, so it's worth having it open and trying to understand exactly what's going wrong.
The below shows a screenshot of a warning output by OGL when a statement in the shader code doesn't end with a semicolon.

Console warnings saying that the shader has not been compiled, while giving hints as to what the error may be
There are a few things to note here:
- Fragment shader is not compiled: this indicates a build-time issue, so there's likely a problem with the syntax of your code, not a run-time issue.
- Error: 0:6: 'in' : syntax error: this indicates an error on line 6. While line 6 itself is fine, you can see that line 4 hasn't ended with a semicolon, which breaks the next line of code.

The error messages can be a little esoteric, so it may take a little investigating to resolve the problems you come across. And it's likely that you'll come across some, as there are LOTS of gotchas when it comes to writing shaders.
If you haven't written shader code before, a few things will keep tripping you up.
I'd recommend installing the WebGL GLSL Editor extension to get syntax highlighting for GLSL files.
Since this isn't a deep dive into the GLSL language, I won't spend too much time on the syntax, but there are things to be aware of:

- We'll only be using the float, vec2, vec3, and vec4 types in this article.

OGL does a good job of displaying error messages in the console when there's a compilation error. It'll usually point you in the right direction if there's a problem. In fact, here's some broken GLSL. Replace your existing program variable with it and try to resolve the issues using the console to guide you:
const program = new Program(gl, {
vertex: `#version 300 es
in vec2 uv;
in vec2 position;
out vec2 vUv;
void main() {
vUv = uv;
gl_Position = vec4(position, 0.f, 1.f);
}`,
fragment: `#version 300 es
precision mediump float;
uniform float uTime
in vec2 vUv;
out vec4 fragColor;
void main() {
float hue = sin(uTime) * 0.5f + 0.5f;
vec2 color = vec3(hue, 0.0f, hue);
fragColor = vec4(color, 1);
}
`,
uniforms: {
uTime: { value: 0 },
},
});
Try your best to resolve all the errors in the fragment shader using the console warnings. Here are the solutions:

- uniform float uTime requires a semicolon at the end.
- vec2 color = vec3(hue, 0.0f, hue); has an incorrect type in the variable definition. It should be a vec3, not a vec2.
- fragColor = vec4(color, 1) fails because 1 is an integer, not a float, which is the type we've specified for fragColor.

Now that we've set up all the boilerplate to render a shader, let's go ahead and convert our purple shader over to something more interesting:
A Perlin noise shader that shows a lava lamp style effect, using highly saturated colors, like bright red, green, blue, and purple
We'll start by creating files for our shaders and moving the inline code into them.
Create a vertex.glsl file and cut/paste the inline vertex shader into this file
Create a fragment.glsl file and do the same.
Note: It's important that the #version statement is on the very first line of each file, otherwise the browser won't be able to compile the GLSL files.
Since Vite handles the importing of plain text files, we can go ahead and import the fragment and vertex shaders directly within our JS file:
import fragment from "./fragment.glsl?raw";
import vertex from "./vertex.glsl?raw";
Now update the Program constructor to reference these two imports
const program = new Program(gl, {
vertex,
fragment,
uniforms: {
uTime: { value: 0 },
},
});
If everything's been moved over correctly, the browser should still be rendering the purple shader.
Now that we've finished our setup, we're going to create a more interesting shader. This one's going to use a Perlin noise algorithm to generate natural-feeling movement.
These kinds of algorithms are commonly used when creating water effects, so it's handy to have them in your shader toolbelt.
If you're interested in learning more about Perlin noise, or noise algorithms in general, this Book of Shaders chapter is worth the read. Fun fact: Perlin noise was created by Ken Perlin to generate realistic textures using code, which he needed for the Disney movie Tron.
We're also going to start passing through more uniform values.
const program = new Program(gl, {
vertex,
fragment,
uniforms: {
uTime: { value: 0 },
+ uFrequency: { value: 5.0 },
+ uBrightness: { value: 0.5 },
+ uSpeed: { value: 0.75 },
+ uValue: { value: 1 },
},
});
Jump into the fragment.glsl file, delete everything inside of it, and paste in the following.
#version 300 es
precision mediump float;
uniform float uFrequency;
uniform float uTime;
uniform float uSpeed;
uniform float uValue;
in vec2 vUv;
out vec4 fragColor;
#include "lygia/generative/cnoise.glsl"
vec3 hsv2rgb(vec3 c) {
vec4 K = vec4(1.0f, 2.0f / 3.0f, 1.0f / 3.0f, 3.0f);
vec3 p = abs(fract(c.xxx + K.xyz) * 6.0f - K.www);
return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0f, 1.0f), c.y);
}
void main() {
float hue = abs(cnoise(vec3(vUv * uFrequency, uTime * uSpeed)));
vec3 rainbowColor = hsv2rgb(vec3(hue, 1.0f, uValue));
fragColor = vec4(rainbowColor, 1.0f);
}
There's a lot going on here, but I want to focus on two things.

First, if you look at the main function, you can see that we use the cnoise function to generate the hue of each pixel. Secondly, we're importing that cnoise function from a helper library called Lygia.

Our GLSL file doesn't have access to the Lygia helpers by default, so we need to make a couple of changes back in the main.mjs file. You need to import resolveLygia and wrap it around any shader that needs access to Lygia modules:
import { resolveLygia } from "resolve-lygia";
// rest of code
const program = new Program(gl, {
fragment: resolveLygia(fragment),
// rest of options
});
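If the hsv2rgb helper in fragment.glsl looks cryptic, here's a plain-JavaScript port of the same per-channel math that's easier to poke at (the helper names fract, clamp, and mix mirror the GLSL built-ins; this is a sketch for intuition, not project code):

```javascript
// JS equivalents of the GLSL built-ins used by hsv2rgb.
function fract(x) { return x - Math.floor(x); }
function clamp(x, lo, hi) { return Math.min(Math.max(x, lo), hi); }
function mix(a, b, t) { return a * (1 - t) + b * t; }

// Same per-channel formula as the GLSL hsv2rgb: build a triangle wave
// from the hue for each of the R, G, B channels, then blend towards
// white based on saturation and scale by value.
function hsv2rgb(h, s, v) {
  const K = [1.0, 2.0 / 3.0, 1.0 / 3.0];
  return K.map((k) => {
    const p = Math.abs(fract(h + k) * 6.0 - 3.0);
    return v * mix(1.0, clamp(p - 1.0, 0.0, 1.0), s);
  });
}

console.log(hsv2rgb(0, 1, 1));     // ≈ [1, 0, 0] (pure red)
console.log(hsv2rgb(1 / 3, 1, 1)); // ≈ [0, 1, 0] (pure green)
```

This is why the shader feeds the noise value in as the hue: sweeping h from 0 to 1 walks the full rainbow while saturation and value stay fixed.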
With that completed, you should see a shader with a natural-feeling animation.
A Perlin noise shader that shows a lava lamp style effect, using highly saturated colors, like bright red, green, blue, and purple
It might not look and feel perfect, but later on we'll integrate the mechanism that'll allow you to easily tweak the various values.
Now that we've created our first shader, let's create an ASCII shader that replaces the pixels with ASCII characters.
We'll start by creating the boilerplate necessary for another shader.
Create a new file called ascii-vertex.glsl and paste the following code:
#version 300 es
in vec2 uv;
in vec2 position;
out vec2 vUv;
void main() {
vUv = uv;
gl_Position = vec4(position, 0., 1.);
}
You may have noticed that it's exactly the same as the vertex.glsl file. This is common boilerplate when you don't need to play around with any of the vertex positions.
Create another file called ascii-fragment.glsl and paste the following code:
#version 300 es
precision highp float;
uniform vec2 uResolution;
uniform sampler2D uTexture;
out vec4 fragColor;
float character(int n, vec2 p) {
p = floor(p * vec2(-4.0f, 4.0f) + 2.5f);
if(clamp(p.x, 0.0f, 4.0f) == p.x) {
if(clamp(p.y, 0.0f, 4.0f) == p.y) {
int a = int(round(p.x) + 5.0f * round(p.y));
if(((n >> a) & 1) == 1)
return 1.0f;
}
}
return 0.0f;
}
void main() {
vec2 pix = gl_FragCoord.xy;
vec3 col = texture(uTexture, floor(pix / 16.0f) * 16.0f / uResolution.xy).rgb;
float gray = 0.3f * col.r + 0.59f * col.g + 0.11f * col.b;
int n = 4096;
if(gray > 0.2f)
n = 65600; // :
if(gray > 0.3f)
n = 163153; // *
if(gray > 0.4f)
n = 15255086; // o
if(gray > 0.5f)
n = 13121101; // &
if(gray > 0.6f)
n = 15252014; // 8
if(gray > 0.7f)
n = 13195790; // @
if(gray > 0.8f)
n = 11512810; // #
vec2 p = mod(pix / 8.0f, 2.0f) - vec2(1.0f);
col = col * character(n, p);
fragColor = vec4(col, 1.0f);
}
Credit for the ASCII algorithm goes to the author of this shader on ShaderToy. I made a few tweaks to simplify it, but the core of it is the same.
As I mentioned at the top, it calculates the amount of grey in each 16×16 pixel square and replaces it with an ASCII character.
The texture function allows us to sample the fragment color from the first shader's output. We'll pass this through as a uniform value from within the JavaScript file. With this data, we can calculate the amount of grey used in that region and render the corresponding ASCII character.
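Those magic numbers like 4096 and 65600 are packed glyph bitmaps: each of the lower 25 bits of n is one pixel of a 5×5 character, with bit index x + 5 * y, which is exactly what the character function tests with (n >> a) & 1. A small sketch that unpacks one (decodeGlyph is an illustrative name, and the print orientation is simplified):

```javascript
// Unpack a 5x5 glyph bitmap from an integer. Bit (x + 5 * y) of n is
// the pixel at column x, row y; y = 4 is printed first so the glyph
// appears upright.
function decodeGlyph(n) {
  const rows = [];
  for (let y = 4; y >= 0; y--) {
    let row = "";
    for (let x = 0; x < 5; x++) {
      row += ((n >> (x + 5 * y)) & 1) ? "#" : ".";
    }
    rows.push(row);
  }
  return rows;
}

// 4096 = 1 << 12, and 12 = 2 + 5 * 2: a single lit pixel dead centre,
// i.e. the "." character used for the darkest cells.
console.log(decodeGlyph(4096).join("\n"));
```

Try it with 65600 (the ":" glyph) and you'll see two lit pixels stacked vertically.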
So let's go ahead and set that up. The first step is to create a new program and a mesh for the ASCII shader. We'll also reuse the existing geometry.
After that, you'll need to make a few tweaks inside the update function. You'll need to pass through the screen size, as the ASCII shader needs that information to calculate the dimensions. Finally, render it just like the other scene.
import asciiVertex from './ascii-vertex.glsl?raw';
import asciiFragment from './ascii-fragment.glsl?raw';
const asciiShaderProgram = new Program(gl, {
vertex: asciiVertex,
fragment: asciiFragment,
});
const asciiMesh = new Mesh(gl, { geometry, program: asciiShaderProgram });
// Rest of code
function update(t) {
// existing rendering logic
const width = gl.canvas.width;
const height = gl.canvas.height;
asciiShaderProgram.uniforms.uResolution = {
value: [width, height],
};
renderer.render({ scene: asciiMesh, camera });
}
Nothing's going to happen just yet: since we're not passing a texture to the ASCII shader, the shader will error. The next step is to render the first shader and store the result in memory. Once we've done that, we can pass that data through to our ASCII shader. We can do this by creating an instance of RenderTarget, a class provided by OGL that wraps a WebGL framebuffer. The rendered output of our first shader gets stored within it.
import {
// other imports
RenderTarget,
} from "ogl";
// Renderer setup
const renderTarget = new RenderTarget(gl);
const asciiShaderProgram = new Program(gl, {
vertex: asciiVertex,
fragment: asciiFragment,
+ uniforms: {
+ uTexture: {
+ value: renderTarget.texture,
+ },
+ },
});
function update(t) {
// existing code
- renderer.render({ scene: mesh, camera });
+ renderer.render({ scene: mesh, camera, target: renderTarget });
// existing code
}
Once that's done, your ASCII shader should be working nicely.
The finished shader. A cool lava lamp-like effect that's generated using ASCII characters.
What's particularly fun about creating shaders is endlessly tweaking the values to come up with really fun and interesting patterns.
You can manually tweak the values inside the GLSL files directly, but it's much less hassle to use a control panel instead.
We'll use Tweakpane for our control panel. Getting it set up is a breeze: just import the Pane class, create an instance of a pane, and then add bindings to the shader's uniform values.
Remember those uniform values we passed through to the fragment shader earlier? Let's bind those values to the control panel so we can tweak them in the browser:
import { Pane } from 'tweakpane';
// Just before the update loop
const pane = new Pane();
pane.addBinding(program.uniforms.uFrequency, "value", {
min: 0,
max: 10,
label: "Frequency",
});
pane.addBinding(program.uniforms.uSpeed, "value", {
min: 0,
max: 2,
label: "Speed",
});
pane.addBinding(program.uniforms.uValue, "value", {
min: 0,
max: 1,
label: "Lightness",
});
Now you can play with the values and see everything update in real time.
The Perlin ASCII shader changes in real time as the user plays with the settings in the control panel. The speed of the noise increases and decreases, as does the brightness of the shader.
I hope you had some fun exploring shaders. Don't sweat it if you find shaders a little confusing. I've found them to be incredibly humbling as a developer, and I'm still only scratching the surface of what they're capable of.
By becoming more familiar with shaders, youâll be able to create unique and performant animations for your web experiences.
Also, if you're interested in learning more about web development, then consider checking out my course, Component Odyssey. It will teach you everything you need to build and publish your very own component library, and you'll learn a ton that'll serve you in your future frontend projects.

Andrico Karoulla is a London-based web developer who has worked at several startups, building everything from design system automation tools at Anima to open-source contributions with projects like Open UI, where he led the site rewrite. He also offers Component Odyssey, a course for developers ready to build framework-agnostic component libraries that are accessible, customizable, and ready for real-world use.