Better Model API #858


Open
bjornbytes opened this issue Mar 19, 2025 · 3 comments

Comments

@bjornbytes
Owner

bjornbytes commented Mar 19, 2025

There are some limitations of the Model API that would be nice to improve:

  • There isn't an easy way to manually render the individual nodes/meshes of a model. This can be useful if you want to use different shaders, materials, or render states when rendering some nodes, or just do the rendering yourself. Technically it's possible: you can get all the meshes in a model, then walk the node tree and use ModelData:getNodeMeshes(node) to see which meshes to render. But there are some downsides:
    • It's a little more cumbersome than I'd like.
    • It does not work with animations.
  • Using your own materials with a Model is pretty hard. People often have a simple OBJ model with one node and one mesh, and want to load the texture manually and use it with the Model. It seems like this should work with Pass:setMaterial(texture), but the model doesn't use the pass's material -- it uses its own material (which is usually just plain white). There is a workaround where you can load the model with { material = false }, but it shouldn't be necessary.
  • It isn't easy to create a MeshShape or ConvexShape from a "piece" of a model. There isn't an existing Model/ModelData API that can give you the triangles of a single node or mesh, and even if you get the full vertex/index buffer, you have to get the mesh draw range (or ranges, for a node) and extract that subset of vertices yourself.
    • There are lots of ways to solve this. MeshShape constructor could take a Model plus a node/mesh, or a vertex range, but these feel somehow wrong. Alternatively there could be a way to query different sets of triangles from the Model.
    • Also, Model:getMesh doesn't work because the Mesh itself references the model's entire vertex buffer and uses the draw range to reference a slice from that buffer, which newMeshShape isn't able to understand.
  • Relatedly, it's also hard to just get the vertices from a Model. ModelData's API to return vertices is able to support arbitrary vertex formats, which makes it really difficult to "just get the UVs from the mesh" or whatever. You have to search through all the attributes, find the UV attribute, and then hope it's in a format that works for you.
    • This is because glTF supports arbitrary vertex formats. That might be overkill, though; I've been thinking LÖVR's ModelData object could convert all the vertices into a standard format on import (Model already does this and no one's complained, so maybe we can just move that logic to ModelData).
  • We currently do not support what glTF calls "instantiated meshes" (animated meshes referenced by multiple nodes, each potentially with a different skin or set of blend shape weights). It would be nice to support this.
  • As discussed in Multiple Animated Poses Per Model #701, models can only have a single animated state right now. Model:clone solves this for now, but we should keep this in mind while redesigning Model-related APIs.
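The material workaround mentioned above looks roughly like this today (a sketch; the asset filenames are made up):

```lua
-- Load a single-mesh OBJ without its baked-in material, so the pass's
-- material is used instead. `{ material = false }` is the workaround
-- described above; ideally Pass:setMaterial would just work without it.
local model = lovr.graphics.newModel('chair.obj', { material = false })
local texture = lovr.graphics.newTexture('chair.png')

function lovr.draw(pass)
  pass:setMaterial(texture)
  pass:draw(model)
end
```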

Here are some ideas to improve the API and solve some of these problems:

Improved Node Walking API

  • Model:getNodeChildren is bad because it creates garbage. It also requires recursion to iterate over the tree.
  • Consider the following alternative APIs:
    • Model(Data):getNodeChild --> returns index of a node's first child
    • Model(Data):getNodeSibling --> returns index of a node's next sibling
    • for node in Model(Data):nodes([mode]) --> returns a Lua iterator that iterates over all node indices
      • mode could let you iterate in DFS/BFS order.
      • There could be other parameters like "only visible nodes" or "root node to start at" or "recurse".
      • I think this could be a stateless iterator to avoid garbage.

This would let you do something like the following to walk over nodes:

```lua
for node in model:nodes() do
  for mesh in model:meshes(node) do -- assume some way to get meshes for a node
    pass:draw(mesh, model:getNodeTransform(node))
  end
end
```

This makes it a lot easier to walk the node graph.
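For reference, a stateless depth-first iterator could be built on the proposed child/sibling pair. A rough Lua sketch, where getNodeChild and getNodeSibling are the hypothetical APIs from above (getNodeParent and getRootNode already exist):

```lua
-- Sketch of a stateless DFS iterator using the proposed
-- Model:getNodeChild / Model:getNodeSibling (hypothetical APIs).
-- A stateless iterator creates no closures, so no per-call garbage.
local function nextNode(model, node)
  if not node then return model:getRootNode() end
  local child = model:getNodeChild(node)
  if child then return child end
  -- No child: climb up until a next sibling exists (assumes
  -- getNodeParent returns nil at the root, ending iteration).
  while node do
    local sibling = model:getNodeSibling(node)
    if sibling then return sibling end
    node = model:getNodeParent(node)
  end
end

local function nodes(model)
  return nextNode, model -- control variable starts at nil
end

-- Usage: for node in nodes(model) do ... end
```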

Improved Mesh API

  • Suggestion: Convert Model vertices immediately when importing, in ModelData, rather than in Model.
    • LÖVR will have a standard vertex format for model vertices.
    • ModelData accessors can be simplified, and return data in a known format.
    • There could be a function like ModelData:encode that returns a Blob with LÖVR-optimized binary model data, which would make models very quick to load.
  • Keep Model:getMesh. Maybe it could use something other than draw range to refer to a subsection of the Model's vertex buffer, not sure yet.
  • Make Model(Data):getTriangles more flexible:
    • You should be able to pass in a mesh index and get local vertices for just that mesh.
    • There is a notion of "local vertices" vs. "full vertices" (name TBD):
      • local vertices are the raw contents of the vertex buffer. The vertices are not duplicated for every node that references them, and they are not transformed by their node transform(s). This is basically the contents of the Model's vertex buffer, or what you'd want if you're creating a MeshCollider from a specific node in the model.
      • "full" vertices are the full set of vertices in the model: they are duplicated and transformed by each node.
  • Replace ModelData's :getMeshVertex and :getMeshIndex with :getVertices and :getIndices (or :getVertex / :getIndex) methods.
    • These should have the same flexibility mentioned above for :getTriangles, so you can get the vertices for the entire model, or just a single mesh.
    • Additionally, they'll return data in a single, known format.
  • Replace ModelData's :getMeshDrawMode and :getMeshMaterial with ModelData:getMesh
    • Consider mode, material, start, count, base = ModelData:getMesh(i)
  • Add for i, mesh in Model:meshes(node) iterator that lets you quickly iterate over Mesh objects that belong to a node.
    • Unclear if this one can be stateless.

All of this should make it easier to grab vertices/indices/triangles out of a model.
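Putting the proposals above together, usage might look like this (all signatures are tentative, and `modelData` is a stand-in variable):

```lua
-- Hypothetical usage of the proposed ModelData accessors (nothing here
-- is final). Get the draw parameters for one mesh:
local mode, material, start, count, base = modelData:getMesh(i)

-- Get "local" (untransformed, non-duplicated) triangles for a single
-- mesh, in the standard vertex format:
local vertices, indices = modelData:getTriangles(i)

-- ...which could feed straight into physics:
local shape = lovr.physics.newMeshShape(vertices, indices)
```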

First Class Vertex Animation

Ok this is a little more out there, but it would allow you to render animated models manually:

  • Meshes can have skinning info (joint indices and joint weights). Maybe just specially-named vertex attributes.
  • Meshes can have blend shapes (a blend shape is a position/normal/tangent displacement for each vertex)
    • You declare the blend shape count when creating the mesh, and the mesh creates an extra buffer to hold all the blend shape values.
  • Add an Animator object
    • An animator manages the state for an animated mesh
    • It contains a list of joint transforms and a list of blend shape weights.
    • So you can do Animator:setJointPose(i, ...pose) and Animator:setBlendShapeWeight('smile', 1)
    • The Animator also holds the vertex buffer for the animated vertices.
    • You can attach an animator to a mesh: Mesh:setAnimator(animator), kinda like a material.
    • When you draw a mesh, if it has an animator attached then it will use the animator's joint poses and blend shape weights during rendering.
    • You could draw the same mesh with different animators to get multiple different animated poses, without having to duplicate the mesh or clone it or anything like that.
  • Model would create an Animator object for each node with an animated mesh.
    • This allows us to support mesh instantiation. Nodes can reference the same Mesh, but they'll have different Animator objects to hold a unique animated state for their mesh.
  • Methods like Model:animate / Model:setBlendShapeWeight will basically call down into all the different Animator objects that are affected by the animation.
    • Also you can think of Model:clone as creating a new Model with its own set of node transforms and animators.
  • The model:meshes(node) iterator will return mesh, animator pairs.

So to render a model manually, you could do the following:

```lua
for node in model:nodes() do
  for mesh, animator in model:meshes(node) do
    mesh:setAnimator(animator)
    pass:draw(mesh, model:getNodeTransform(node))
  end
end
```

Which is pretty nice!

It also makes it possible for you to reuse LÖVR's animation machinery to give your own Mesh objects blend shapes or skeletal animations. I'm not sure if anyone would end up doing that though...

However, I'm not totally sold on Animator. I'm not sure if the extra complexity on Mesh would be worth it. Maybe it's easy to ignore the blend shape stuff for non-animated meshes and it would be fine.
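For illustration, drawing one Mesh with two independent poses might look like this (Animator and every method on it are hypothetical, as are the joint index and blend shape name):

```lua
-- Hypothetical Animator API: two animators drive the same Mesh with
-- different poses, without cloning the mesh or its vertex data.
local idle = lovr.graphics.newAnimator(mesh)
local wave = lovr.graphics.newAnimator(mesh)
wave:setJointPose(armJoint, x, y, z, angle, ax, ay, az)
wave:setBlendShapeWeight('smile', 1)

function lovr.draw(pass)
  mesh:setAnimator(idle)
  pass:draw(mesh, -1, 1, -2)

  mesh:setAnimator(wave)
  pass:draw(mesh, 1, 1, -2)
end
```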

@DonaldHays

I only briefly skimmed this issue, but the parts about materials sounded relevant to this issue in my shading library: DonaldHays/lovr-base-shading#5

I have a BaseMaterial class for the library, and I haven't tackled making it play nice with Lovr Models, but it doesn't immediately seem the most intuitive fit. I had been figuring I just needed to take a closer look at how Lovr handles such things, but it might be a use case worth considering if revisions are being considered.

> This might be overkill, I've been thinking LÖVR's ModelData object could convert all the vertices into a standard format on import (Model already does this and no one's complained, maybe we can just move that logic to ModelData).

One project I want to tackle at some point is models that have pre-calculated lightmaps, like Quake levels do. Such models need to have two sets of UV coordinates: one for the primary material, and another for the lightmap coordinates. I've looked a little at Lovr's Models before, and haven't been sure how I would tackle it, since I've only seen built-in support for a single UV attribute. Would this proposed standard vertex format support custom attributes in addition to the standard defined attributes?

> It also makes it possible for you to reuse LÖVR's animation machinery to give your own Mesh objects blend shapes or skeletal animations. I'm not sure if anyone would end up doing that though...

I'm not sure if this section is talking about passing custom animations on existing blend shapes, or defining new blend shapes on raw vertex data. But for custom animations on models with existing blend shapes and armatures, I would be interested in being able to do IK on armatures, and manipulating blend shapes for things like character customization, or lip-synched voice lines.

@bjornbytes
Owner Author

materials

There are things I want to do to make Materials less opinionated and more useful, but I'm mentally considering this a different topic since it affects more than just Models. They aren't very convenient to work with for any non-model use cases, and they are hard to extend. (They only really exist in the first place to plumb model material data through to shaders; they were never really intended as a tool to build your own material system on top of... kind of.)

The vague idea I have to improve Materials is to remove them entirely and make it so any material properties in the model automatically get sent to any uniforms in the shader with name-based matching. So if you want to render stuff with normal mapping, just declare NormalTexture variable in your shader and then any models you render will automatically send their normal map to that shader variable. The material properties would also be accessible as a key-value dictionary so that you can wire them up to other inputs. There were some reasons that this was difficult/impossible in the earlier days of Vulkan, but now I think it would work.
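Under that idea, normal mapping might look something like this. The automatic NormalTexture plumbing is the hypothetical part; this is only a sketch of how declaring the uniform could be all you'd need:

```lua
-- Hypothetical: a NormalTexture uniform would automatically receive the
-- normal map of any model drawn with this shader, matched by name.
local shader = lovr.graphics.newShader('unlit', [[
  uniform texture2D NormalTexture;

  vec4 lovrmain() {
    vec3 normal = getPixel(NormalTexture, UV).rgb * 2. - 1.;
    return vec4(normal * .5 + .5, 1.); -- visualize the normal map
  }
]])
```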

lightmaps

Just tried adding support for lightmapped models in #860.

We don't currently support importing custom vertex attributes from the glTF. Changing ModelData to immediately convert to a standard vertex format makes it more difficult to add support for that.

custom animations

That section is talking about being able to add blend shapes and skeletal animation to a Mesh object. This would be done more as a way of making animation a "first class feature" so that you can render bits of a model manually. Theoretically you could hook into it if you wanted to import custom blend shapes from your own file format, which is interesting, but I feel like it's niche.

You can already define your own animations (procedural or keyframed), and do IK for that matter, by moving nodes around with Model:setNodeTransform and similar methods. I don't think IK will ever be built in: there are many different IK algorithms and so many ways to tweak them that I don't think LÖVR could ship a built-in IK solution that satisfies everyone. IMO it's best left to Lua libraries that move the nodes around (that way you can also use them with non-Model rigs).
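For example, a trivial procedural animation driven this way, which works today (the 'Head' node name is made up):

```lua
-- Procedural animation by moving a node directly. 'Head' is a
-- hypothetical node name from the model's rig.
function lovr.update(dt)
  local angle = math.sin(lovr.timer.getTime()) * .3
  model:setNodeOrientation('Head', angle, 0, 1, 0)
end
```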

@bjornbytes
Owner Author

Another thing from chat: it would be nice if models respected the sampler info from the glTF (e.g. nearest vs. linear filtering, clamp vs. repeat wrapping). I think some other sampler-related reworking has to happen first, though.
