The following are demo apps I created to learn ThreeJS, WebGL, 3D math, and Blender.
Create a clone of the classic arcade game Asteroids.
Learn how to create original 3D assets in Blender from scratch and export them so that they can be imported into a ThreeJS web app.
See Blender Adventures post.
Implement a shader that can be used to render a Star Wars-inspired hologram.
Learn to write a GLSL shader.
This was my first time implementing my own lighting. Since a hologram is intentionally not photorealistic, I decided simple Lambertian reflectance alone would suffice. For this I used a hard-coded light direction and the normal vector passed in from the vertex shader:
vec3 lightDirection = normalize(vec3(0.7, 0.5, 1.0));
float diffuse = max(dot(vNormal, lightDirection), 0.0);
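This diffuse term is easy to sanity-check on the CPU. Here is a minimal plain-JavaScript sketch of the same math (the helper names are mine, not from the demo's source):

```javascript
// Plain-JS sketch of the Lambertian diffuse term above.
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}

function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// max(dot(N, L), 0): full brightness when the surface faces the light,
// clamped to zero when it faces away.
function lambert(normal, lightDirection) {
  return Math.max(dot(normal, lightDirection), 0.0);
}

const lightDirection = normalize([0.7, 0.5, 1.0]);
lambert([0, 0, 1], lightDirection); // surface facing +Z is partially lit
lambert([0, 0, -1], lightDirection); // back-facing surface gets 0
```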
I also learned to convert vectors from view (camera) space to world space. In Star Wars, holograms exhibit noise (presumably due to poor transmission) that can distort the image dramatically. I wanted a hologram's mesh geometry to occasionally skew and distort in a chaotic but controlled way; try changing the "wiggle" settings in the demo to see the effect I am referring to. I offset a given mesh's vertices by a "noise shift" vector representing the amount of skewing caused by transmission noise. To make this shift look right from every angle, I decided to move the vertices in the distorted sections of the mesh to the viewer's left and right, rather than along a fixed world-space axis (e.g. east-west). To find the new, distorted location for a vertex, I used the inverse view matrix to compute a point in 3D world space to the viewer's left or right of the vertex's ordinary location.
vec4 noiseShift = inverseViewMatrix * vec4(wiggleFactor * sin(x / 3.0) * sin(x / 13.0), 0.0, 0.0, 0.0);
vec3 shiftedPosition = noiseShift.xyz / 7.0 + position;
vec4 mvPosition = modelViewMatrix * vec4(shiftedPosition, 1.0);
gl_Position = projectionMatrix * mvPosition;
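The key idea, that multiplying a view-space direction by the inverse view matrix yields a world-space direction, can be illustrated without a shader at all. A minimal sketch in plain JavaScript with column-major matrices as in GLSL (nothing here is from the demo's source):

```javascript
// Applies a column-major 4x4 matrix (GLSL layout) to a 4-component vector.
function transform(m, v) {
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    for (let col = 0; col < 4; col++) {
      out[row] += m[col * 4 + row] * v[col];
    }
  }
  return out;
}

// Inverse view matrix (camera-to-world) for a camera yawed 90 degrees
// about the world Y axis. Its columns are the camera's right, up, and
// backward axes expressed in world space.
const inverseViewMatrix = [
  0, 0, -1, 0, // camera "right" points along world -Z
  0, 1, 0, 0,  // camera "up" is world +Y
  1, 0, 0, 0,  // camera "backward" is world +X
  0, 0, 0, 1,
];

// The view-space offset (1, 0, 0, 0) means "to the viewer's right"
// (w = 0, so translation is ignored). It comes out as the world-space
// direction (0, 0, -1) for this camera.
const worldShift = transform(inverseViewMatrix, [1, 0, 0, 0]);
```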
Reflect an object across a 3D plane using matrix transformations.
Exercise my understanding of matrices and other 3D math concepts.
Historically, a common solution for rendering reflections (e.g. an object reflected in a mirror) is to render the object a second time with its vertices transformed to the mirrored virtual location. I wanted to test my 3D graphics and linear algebra skills by seeing whether I could figure out this transformation on my own. The "mirror" plane (red in the demo) translates and rotates within the 3D world. My solution ended up looking like the following (reflection is the reflected torus knot object):
const reflectionMatrix = new Matrix4()
.multiply(planeGroup.matrix) // Convert from plane-local space to world space.
.multiply(new Matrix4().makeScale(1, 1, -1)) // Mirror it.
.multiply(new Matrix4().getInverse(planeGroup.matrix)) // Convert world space to plane-local space.
.multiply(meshMatrixCopy); // Convert mesh-local space to world space.
reflection.matrix = reflectionMatrix;
In short, I discovered you can solve this problem by converting the mesh's world-space coordinates to "mirror space" (a model space local to the mirror), performing the Z-coordinate flip (the mirroring operation) in mirror-space coordinates, and then converting back to world space.
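The result can also be verified numerically with the closed-form point reflection p' = p - 2((p - p0) . n)n, where n is the plane's unit normal and p0 is a point on the plane. A plain-JavaScript sketch (illustrative, not code from the demo):

```javascript
// Reflects point p across the plane through point p0 with unit normal n:
// p' = p - 2 * ((p - p0) . n) * n
function reflectAcrossPlane(p, p0, n) {
  const d =
    (p[0] - p0[0]) * n[0] + (p[1] - p0[1]) * n[1] + (p[2] - p0[2]) * n[2];
  return [p[0] - 2 * d * n[0], p[1] - 2 * d * n[1], p[2] - 2 * d * n[2]];
}

// Across the plane z = 0 (the "mirror space" flip):
reflectAcrossPlane([1, 2, 3], [0, 0, 0], [0, 0, 1]); // → [1, 2, -3]

// Across the translated plane x = 1:
reflectAcrossPlane([3, 0, 0], [1, 0, 0], [1, 0, 0]); // → [-1, 0, 0]
```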
Modify the ThreeJS example reflector to support blurred reflections.
Learn to apply post-processing effects to texture render targets. Few materials other than mirrors reflect objects perfectly; blurring a reflection slightly adds a level of realism.
I had applied post-processing effects in ThreeJS before, but those effects were applied to the final buffer rendered to the screen. The new challenge for me here was applying a post-processing effect to an intermediate texture render target. (Reflected objects are mirrored over the reflective plane and rendered separately to a texture.) I needed to apply my blur filter to that texture. Originally I tried implementing a convolution filter in a shader. I later realized I could use ThreeJS' existing bloom pass filter as my blur filter.
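For reference, the convolution approach I first attempted amounts to a weighted average of neighboring texels. A minimal 1-D sketch in plain JavaScript (the shader version does the same thing per pixel; this is illustrative, not the demo's code):

```javascript
// 1-D convolution with a normalized kernel, clamping at the edges --
// the same idea a separable box/Gaussian blur shader applies per texel.
function convolve1d(values, kernel) {
  const half = Math.floor(kernel.length / 2);
  return values.map((_, i) => {
    let sum = 0;
    for (let k = 0; k < kernel.length; k++) {
      const j = Math.min(values.length - 1, Math.max(0, i + k - half));
      sum += values[j] * kernel[k];
    }
    return sum;
  });
}

// A [1, 2, 1] / 4 kernel softens a hard edge:
convolve1d([0, 0, 1, 1], [0.25, 0.5, 0.25]); // → [0, 0.25, 0.75, 1]
```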
Render an asset from a game I created in 2008.
Learn to render a glTF file in ThreeJS. Learn the updated Blender UI.
In high school I created a game using the Irrlicht Engine. I created some of the assets myself in Blender and Photoshop.
The last time I had used Blender was before version 2.8, which introduced tabbed workspaces at the top of the user interface. I found these super helpful for switching between tasks like modeling and UV editing. I ended up modifying the asset's UV map before re-exporting it as a .gltf file. The problem with the original asset was that the texture was stretched along the circumference of the barrel. I changed the UV coordinates to repeat the texture twice, mirrored, around the barrel's circumference. While this adds a repeated pattern around the barrel, it devotes more texels per unit of surface area, resulting in an apparent doubling of texture resolution and a higher-quality look.
Create a text slide that leverages real-time 3D graphics.
Solve a real-world problem using ThreeJS.
My past employer, Logos, creates Proclaim—a church presentation software application—and sells media to be used on-screen in churches, e.g. this countdown. Currently most of Proclaim's graphics/media are dynamic 2D slides or static, pre-rendered 3D slides. What if instead Proclaim could leverage a broadly supported graphics API like WebGL or OpenGL ES to make its 3D content dynamic? Churches could modify the 3D graphics with custom text or other options. Proclaim could use CEF to embed a WebGL page in the app and render WebGL scenes to a projector. Faithlife's digital signage system is already browser-based so the WebGL scenes should work for signs as well.
Create an application that lets you toggle meshes' transparencies, depth test usage, and sorting.
Verify my understanding of how ThreeJS solves the problem of rendering multiple objects when some of the objects are translucent.
Prior to creating this demo, I discovered that sometimes transparent objects don't play nice with depth buffers and other transparent objects. To illustrate, ask yourself: How would you depth test opaque objects against translucent ones? What about when you're trying to draw an opaque object behind a translucent one? An ordinary depth test tries to avoid drawing farther objects on top of nearer ones.
By building a demo where I can deconstruct and break things intentionally, I verified ThreeJS's solution to this problem: render all opaque objects first, then render the transparent objects sorted back to front. Note that this strategy is not 100% effective because objects can occlude themselves.
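ThreeJS's back-to-front ordering of transparent objects boils down to a sort by distance from the camera. A plain-JavaScript sketch of that idea (the object shape here is hypothetical; this mimics, rather than reproduces, the renderer's internals):

```javascript
// Squared distance is enough for ordering and avoids a sqrt per object.
function squaredDistance(a, b) {
  const dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return dx * dx + dy * dy + dz * dz;
}

// Sort transparent objects farthest-first so nearer ones are
// composited over farther ones.
function sortBackToFront(objects, cameraPosition) {
  return [...objects].sort(
    (a, b) =>
      squaredDistance(b.position, cameraPosition) -
      squaredDistance(a.position, cameraPosition)
  );
}

const camera = [0, 0, 5];
const objects = [
  { name: "near", position: [0, 0, 4] },
  { name: "far", position: [0, 0, -10] },
  { name: "mid", position: [0, 0, 0] },
];
sortBackToFront(objects, camera).map((o) => o.name); // → ["far", "mid", "near"]
```

This per-object sort is also why self-occlusion breaks the strategy: a single concave transparent mesh gets one position, so its own triangles are not ordered among themselves.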
Extrude an SVG file into a 3D object.
Gain familiarity with ThreeJS geometry APIs like ShapeBufferGeometry and ExtrudeGeometry.
When building geometry from scratch, you don't get a lot for "free." ExtrudeGeometry, for example, doesn't generate normals that result in smooth shading.
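One fix for flat shading is to average the face normals at each shared vertex. A minimal sketch of that computation in plain JavaScript (not ThreeJS's own computeVertexNormals, though it works the same way for indexed geometry):

```javascript
// Computes smooth per-vertex normals for an indexed triangle mesh by
// summing each face's normal into its three vertices, then normalizing.
function computeSmoothNormals(positions, indices) {
  const normals = new Array(positions.length).fill(0);
  for (let t = 0; t < indices.length; t += 3) {
    const ia = indices[t] * 3, ib = indices[t + 1] * 3, ic = indices[t + 2] * 3;
    const ab = [
      positions[ib] - positions[ia],
      positions[ib + 1] - positions[ia + 1],
      positions[ib + 2] - positions[ia + 2],
    ];
    const ac = [
      positions[ic] - positions[ia],
      positions[ic + 1] - positions[ia + 1],
      positions[ic + 2] - positions[ia + 2],
    ];
    // Face normal = ab x ac (unnormalized, so larger faces weigh more).
    const n = [
      ab[1] * ac[2] - ab[2] * ac[1],
      ab[2] * ac[0] - ab[0] * ac[2],
      ab[0] * ac[1] - ab[1] * ac[0],
    ];
    for (const i of [ia, ib, ic]) {
      normals[i] += n[0];
      normals[i + 1] += n[1];
      normals[i + 2] += n[2];
    }
  }
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len;
    normals[i + 1] /= len;
    normals[i + 2] /= len;
  }
  return normals;
}

// Two triangles forming a flat quad in the XY plane: every vertex
// ends up with the shared normal (0, 0, 1).
const positions = [0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0];
const indices = [0, 1, 2, 0, 2, 3];
computeSmoothNormals(positions, indices);
```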
Create a scene with several shadow-casting and shadow-receiving objects.
Learn ThreeJS basics.
This is one of the very first things I created with ThreeJS. I used this scene as an opportunity to learn ThreeJS' core APIs and to get a sense of its performance on old or mobile hardware when rendering shadows with very few shadow-casting lights.