[Brainstorming Archive] Dave's Proposal
Positions and rotations in 3D space are unintuitive for people who aren't well versed in geometry. Artists are instead more used to thinking about space tangibly. Typically, when making a sculpture additively, one starts with an armature, a skeleton of the final creation, and then attaches things to it to build up the final form. Armatures are posed by holding certain parts and moving others. Pieces are connected to joints.
Positioning of elements is done by connecting armatures. This is like creating a stick figure that you will eventually attach actual geometry to. Armature segments are defined by named joints that things can "connect" to. We will provide a library of common armature elements, implemented with the same public API people would use to make their own.
In general, definitions take in a function to be run, so that definitions can use ordinary programming constructs (e.g. loops) to generate their contents dynamically if desired.
```js
const Bone = Armature(() => {
  Joint("root", {x: 0, y: 0, z: 0});
  Joint("leaf", {x: 0, y: 1, z: 0});
});
```
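Because the definition body is just a function, it can also build its joints in a loop. Here is a minimal sketch; the `Ladder` armature and its joint layout are made up purely for illustration:

```js
const Ladder = Armature(() => {
  // Generate four named joints in a loop, one unit apart along y
  range(4).forEach((i) => {
    Joint(`rung${i}`, {x: 0, y: i, z: 0});
  });
});
```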
You can connect armatures together to make a model. Connections are effectively just translations, expressed in a more intuitive manner. A model is initially passed a joint called `base` (the origin) for you to attach things to.
```js
const Tower = Model((base) => {
  const a = Bone();
  const b = Bone();
  const c = Bone();
  a.joint("root").connect(base);
  b.joint("root").connect(a.joint("leaf"));
  c.joint("root").connect(b.joint("leaf"));
});
```
You can change the orientation of joints when connecting them. Rotation commands are incremental and are applied immediately. But instead of specifying rotation axes and amounts, you can "hold" a joint to fix it in place and then point another joint at a target. Mathematically, here's what happens:
- If `a` is held and `b` is told to point at `c`, the rotation vector `(c - a) - (b - a)` is applied.
- If two points `a` and `b` are held, and `c` is told to point at `d`, the rotation vector is `d - a - proj(d - a, b - a) - (c - a - proj(c - a, b - a))` (meaning, it rotates around the `b - a` axis to point as much as it can towards `d`).
- If more than two points are held, an error is thrown.
Arbitrary points can be held, as well as joints. We provide points in general directions to rotate towards (e.g. `LEFT` is defined as `Point({x: -Infinity, y: 0, z: 0})`).
If an amount between 0 and 1 is passed to `pointAt`, then the SLERP algorithm is used to interpolate between the current orientation and the final orientation by the given amount.
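As a rough illustration of the hold-and-point behavior described above (the target points and the 0.5 amount are made up for this sketch; it only uses the `Bone`, `Point`, `hold`, `release`, and `pointAt` calls from this proposal):

```js
const arm = Bone();

// One held joint: "leaf" rotates all the way to face the target.
arm.hold(arm.joint("root"));
arm.joint("leaf").pointAt(Point({x: 1, y: 1, z: 0}));
arm.release(arm.joint("root"));

// Two held points: rotation happens around the axis between them, and the
// 0.5 amount uses SLERP to go only halfway towards the target direction.
const axisPoint = Point({x: 0, y: 0, z: 1});
arm.hold(arm.joint("root"));
arm.hold(axisPoint);
arm.joint("leaf").pointAt(LEFT, 0.5);
arm.release(axisPoint);
arm.release(arm.joint("root"));
```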
Armatures are posed in absolute coordinate space, but once connected, rotations on their parents are applied to them. If an artist wants to work entirely in relative coordinate space, they connect armatures first and then rotate them starting from the children up to the root. If they want to work only in absolute coordinate space, they rotate armatures individually and then connect them from the root down to the children.
```js
const Snake = Model((base) => {
  const head = Bone();
  head.joint("root").connect(base);
  let last = head;
  range(5).forEach(() => {
    const next = Bone();
    next.joint("root").connect(last.joint("leaf"));
    next.hold(next.joint("root"));
    // pointAt uses absolute rotations. When passed an amount, it
    // uses SLERP to rotate towards the given direction by that amount
    next.joint("leaf").pointAt(pick([LEFT, RIGHT]), random(0, 0.5));
    next.release(next.joint("root"));
    last = next;
  });
  // Since the other joints are already connected, they get rotated too
  head.joint("leaf").pointAt(RIGHT);
});
```
You can also stretch towards a point. In addition to rotating, this stretches by the magnitude of the rotation vectors defined above (i.e., if you have one held point and you call `stretchTo`, the point being stretched ends up at exactly the point being stretched to).
```js
const TallTower = Model((base) => {
  const head = Bone();
  head.joint("root").connect(base);
  head.hold(base);
  head.joint("leaf").stretchTo(Point({x: 0, y: 100, z: 0}));
  head.release(base);
});
```
By default, stretching scales uniformly. You can optionally specify the stretch mode as a second argument to `stretchTo`:
- `Uniform`, the default, scales all axes uniformly.
- `Squash(volume)` will scale the other axes such that if the shape volume started at `volume` originally, the same volume is preserved after scaling (see the sketch after this list).
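A minimal sketch of passing a stretch mode; the model name and the starting volume of 10 are made up, and it assumes the `Squash` constructor described above:

```js
const SquashedTower = Model((base) => {
  const bone = Bone();
  bone.joint("root").connect(base);
  bone.hold(base);
  // Squash(10): the other axes are scaled so that a shape whose volume
  // started at 10 keeps that volume after the stretch.
  bone.joint("leaf").stretchTo(Point({x: 0, y: 100, z: 0}), Squash(10));
  bone.release(base);
});
```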
Given a base armature definition, you can create instances of it that can be posed separately from one another. When posing an armature instance, you can introspect all of its joints. This part is currently the least polished; please feel free to suggest better ways of doing introspection!
```js
const snake = Snake();
snake.hold(snake.base());
const tail = snake.base().connections()[0].joint("leaf");
tail.pointAt(DOWN);

const snake2 = Snake(); // still in the default pose
```
Artists should be able to focus just on creating armatures before they worry about any of their shapes. You can pass a flag to the `draw` command to show a default visualization of bones (possibly just lines, or, if we want to get fancy, octahedrons so that we can show volume).
```js
draw([snake], { showArmatures: true });
```
As with armatures, we should provide a library of default shapes, all using our public API. Shapes are sets of vertices and all the additional attributes needed to render them. `Vertex`, `Normal`, and `Elements` are default methods to insert attributes and elements into the shape.
```js
// Shapes can take in named params with default values
const Sphere = Shape(({numLat = 20, numLong = 20}) => {
  // algorithm based off of http://learningwebgl.com/blog/?p=1253
  // Generate vertices and normals (one extra ring/column so the sphere closes)
  range(numLat + 1).forEach((lat) => {
    const theta = lat * Math.PI / numLat;
    range(numLong + 1).forEach((long) => {
      const phi = long * 2 * Math.PI / numLong;
      const x = Math.cos(phi) * Math.sin(theta);
      const y = Math.cos(theta);
      const z = Math.sin(phi) * Math.sin(theta);
      Normal({x, y, z}); // All vertices get a normal with a value equal to the last call to Normal
      Vertex({x, y, z});
    });
  });
  // Generate elements array
  range(numLat).forEach((lat) => {
    range(numLong).forEach((long) => {
      const first = lat * (numLong + 1) + long;
      const second = first + numLong + 1;
      // Can set multiple indices at once, making it more readable to group into triangles
      Elements(first, second, first + 1);
      Elements(second, second + 1, first + 1);
    });
  });
});
```
You can connect arbitrary points in space to joints in the armature to define how shapes get attached. This can happen in an armature definition, or after it has been instantiated:
```js
const sphere = Sphere();
sphere.point({x: 0, y: -1, z: 0}).connect(snake.base());
sphere.hold({x: 0, y: -1, z: 0});
sphere.point({x: 0, y: 1, z: 0}).stretchTo(tail);
sphere.releaseAll(); // convenience method so you don't have to keep a reference to the point
```
If a shape used `Joint()` in its definition to create named joints, those can be referred to as well, instead of a point literal:
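For example, a minimal sketch; the `Capsule` shape and its `tip` joint are hypothetical, and the geometry calls are elided:

```js
const Capsule = Shape(() => {
  Joint("tip", {x: 0, y: 1, z: 0});
  // ...Vertex/Normal/Elements calls for the actual geometry...
});

const capsule = Capsule();
// Connect by joint name instead of a point literal.
capsule.joint("tip").connect(snake.base());
```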
A key difference between shapes and armatures is that when you transform a shape, it has no children to affect. If a rotation is intended to propagate to children, the armature should be transformed, not the shape. Shape translations are used entirely for connecting to armatures.
Materials are basically fragment shaders. Shapes have to define all the information required for a material to be able to render them. The default function `Normal` is essentially an alias for defining an attribute storing the vertex normal. We should specify a default material:
```js
const Lambertian = Material({
  attributes: ["normal"],
  uniforms: {color: () => [1, 1, 1]}, // specify a default value for uniforms
  shader: `
    precision mediump float;
    uniform vec3 color;
    varying vec3 normal;
    varying vec3 position;

    void main() {
      vec3 lightPosition = vec3(20.0, 20.0, 20.0);
      // normalize so dot() yields a proper Lambertian term in [0, 1]
      vec3 lightDir = normalize(lightPosition - position);
      float lambertian = max(dot(lightDir, normalize(normal)), 0.0);
      gl_FragColor = vec4(lambertian * color, 1.0);
    }
  `
});
```
```js
const Triangle = Shape(({color = [1, 1, 1]}) => {
  Uniform("color", color);
  Normal({x: 0, y: 0, z: 1}); // basically: Attribute("normal", [0, 0, 1]);
  Vertex({x: -1, y: 0, z: 0});
  Vertex({x: 0, y: 1, z: 0});
  Vertex({x: 1, y: 0, z: 0});
}, Lambertian); // Specifying a material manually
```
Uniforms can be left unset in a shape, with the expectation that they can be set outside the shape. This lets us do things like make a function to set the lights and have that be used in all subsequent material draws. Because they don't need to be specified in a shape, there is the potential that they never get set, so a default value must be specified in the material definition.
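As a rough sketch of that workflow (the `Leaf` shape is made up, and calling `Uniform()` outside a shape definition is an assumption about how this could look):

```js
// The shape leaves "color" unset, so Lambertian's default of [1, 1, 1]
// applies unless something sets the uniform before drawing.
const Leaf = Shape(() => {
  Normal({x: 0, y: 0, z: 1});
  Vertex({x: -1, y: 0, z: 0});
  Vertex({x: 0, y: 1, z: 0});
  Vertex({x: 1, y: 0, z: 0});
}, Lambertian);

// Set the uniform outside the shape; subsequent draws with this material use it.
Uniform("color", [0.2, 0.8, 0.2]);
draw([Leaf()]);
```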
We should provide a default shader that uses Lambertian diffuse + Phong specular highlights, and functions to conveniently fill in the right uniform values:
```js
// Translates to a Uniform() call to set a vec3[] of light positions and an int NUM_LIGHTS on our material
Lights([{x: 1, y: 1, z: 1}, {x: 2, y: 2, z: 2}]);

// Translates to Uniform("color", [1, 1, 1]);
Color("#FFFFFF");
```