# Better Model API #858

Comments
I only briefly skimmed this issue, but the parts about materials sounded relevant to this issue in my shading library: DonaldHays/lovr-base-shading#5 I have a
One project I want to tackle at some point is models that have pre-calculated lightmaps, like Quake levels do. Such models need to have two sets of UV coordinates: one for the primary material, and another for the lightmap coordinates. I've looked a little at Lovr's Models before, and haven't been sure how I would tackle it, since I've only seen built-in support for a single UV attribute. Would this proposed standard vertex format support custom attributes in addition to the standard defined attributes?
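For illustration, a lightmapped vertex would need a layout roughly like this (the attribute names are hypothetical, not an existing LÖVR format):

```lua
-- Hypothetical vertex layout for a Quake-style lightmapped level mesh:
-- the standard attributes plus one extra UV set for the lightmap.
local format = {
  { 'VertexPosition', 'vec3' },
  { 'VertexNormal',   'vec3' },
  { 'VertexUV',       'vec2' }, -- primary material UVs
  { 'LightmapUV',     'vec2' }  -- second UV set, the custom attribute in question
}
```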
I'm not sure if this section is talking about passing custom animations on existing blend shapes, or defining new blend shapes on raw vertex data. But for custom animations on models with existing blend shapes and armatures, I would be interested in being able to do IK on armatures, and manipulating blend shapes for things like character customization, or lip-synched voice lines.
There are things I want to do to make Materials less opinionated and more useful, but I'm mentally considering this a different topic since it affects more than just Models. They aren't very convenient to work with for any non-model use cases, and they are hard to extend. (And they only really exist in the first place to plumb model material data through to shaders; they aren't really intended to be used as a tool to build your own material stuff on top of... kind of). The vague idea I have to improve Materials is to remove them entirely and make it so any material properties in the model automatically get sent to any uniforms in the shader with name-based matching. So if you want to render stuff with normal mapping, just declare …
Just tried adding support for lightmapped models in #860. We don't currently support importing custom vertex attributes from the glTF. Changing ModelData to immediately convert to a standard vertex format makes it more difficult to add support for that.
That section is talking about being able to add blend shapes and skeletal animation to a `Mesh`. You can already define your own animations (procedural or keyframed), and do IK for that matter, by moving nodes around.
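A minimal sketch of that, assuming `Model:setNodePose`; the node name, pose values, and exact signature here are illustrative:

```lua
function lovr.load()
  model = lovr.graphics.newModel('hand.glb')  -- any rigged model
end

function lovr.update(dt)
  -- Wave one joint back and forth every frame, no animation data needed.
  -- 'Hand' is an illustrative node name; setNodePose's exact signature may differ.
  local t = lovr.timer.getTime()
  model:setNodePose('Hand', 0, 1.5, 0, math.sin(t) * 0.5, 0, 0, 1)
end
```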
Another thing from chat: it would be nice if models respected the sampler settings from the glTF (nearest filtering, wrap/repeat modes). I think some other sampler-related reworking has to happen first though.
There are some limitations of the Model API that would be nice to improve:

- Rendering pieces of a model manually requires using `ModelData:getNodeMeshes(node)` to see which meshes to render. There are some downsides:
  - You can do `Pass:setMaterial(texture)`, but the model doesn't use the pass's material -- it uses its own material (which is usually just plain white). There is a workaround where you can load the model with `{ material = false }`, but it shouldn't be necessary.
  - `Model:getMesh` doesn't work because the Mesh itself references the model's entire vertex buffer and uses the draw range to reference a slice from that buffer, which `newMeshShape` isn't able to understand.
  - `ModelData`'s API to return vertices is able to support arbitrary vertex formats, which makes it really difficult to "just get the UVs from the mesh" or whatever. You have to search through all the attributes, find the UV attribute, and then hope it's in a format that works for you.
- `Model:clone` solves this for now, but we should keep this in mind while redesigning Model-related APIs.

Here are some ideas to improve the API and solve some of these problems:
### Improved Node Walking API

`Model:getNodeChildren` is bad because it creates garbage. It also requires recursion to iterate.

- `Model(Data):getNodeChild` --> returns the index of a node's first child
- `Model(Data):getNodeSibling` --> returns the index of a node's next sibling
- `for node in Model(Data):nodes([mode])` --> returns a Lua iterator that iterates over all node indices (`mode` could let you iterate in DFS/BFS order)

This would let you do something like the following to walk over nodes:
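(Sketch only -- these are the proposed methods from the list above, and `getNodeName`/`getRootNode` are assumed to keep working as they do today.)

```lua
-- Proposed iterator: visit every node index, depth-first.
for node in model:nodes('depth') do   -- 'depth' is a guess at the DFS mode name
  print(model:getNodeName(node))
end

-- Proposed first-child/next-sibling accessors: walk one node's children
-- without allocating a table of child indices.
local root = model:getRootNode()
local child = model:getNodeChild(root)
while child do
  print(model:getNodeName(child))
  child = model:getNodeSibling(child)
end
```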
This makes it a lot easier to walk the node graph.
### Improved Mesh API

- `ModelData:encode` that returns a Blob with LÖVR-optimized binary model data, which would make models very quick to load.
- Fix `Model:getMesh`. Maybe it could use something other than a draw range to refer to a subsection of the Model's vertex buffer, not sure yet.
- Make `Model(Data):getTriangles` more flexible:
  - Replace `:getMeshVertex` and `:getMeshIndex` with `:getVertices` and `:getIndices` (or `:getVertex`/`:getIndex`) methods.
  - Same for `:getTriangles`, so you can get the vertices for the entire model, or just a single mesh.
  - Replace `:getMeshDrawMode` and `:getMeshMaterial` with `ModelData:getMesh`: `mode, material, start, count, base = ModelData:getMesh(i)`
- A `for i, mesh in Model:meshes(node)` iterator that lets you quickly iterate over Mesh objects that belong to a node.

All of this should make it easier to grab vertices/indices/triangles out of a model.
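For example, a rough sketch of how the reworked `ModelData` accessors might read (proposed methods only; `getMeshCount` is an assumed companion accessor, and the argument shapes are guesses):

```lua
local data = lovr.data.newModelData('level.glb')

for i = 1, data:getMeshCount() do          -- assumed count accessor
  local mode, material, start, count, base = data:getMesh(i)
  local vertices = data:getVertices(i)     -- proposed replacement for :getMeshVertex
  local indices = data:getIndices(i)       -- proposed replacement for :getMeshIndex
  print(('mesh %d: %s, %d indices'):format(i, mode, count))
end
```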
### First Class Vertex Animation

Ok, this is a little more out there, but it would allow you to render animated models manually:

- An `Animator` object with `Animator:setJointPose(i, ...pose)` and `Animator:setBlendShapeWeight('smile', 1)`.
- `Mesh:setAnimator(animator)`, kinda like a material.
- `Model:animate` / `Model:setBlendShapeWeight` will basically call down into all the different Animator objects that are affected by the animation.
- `Model:clone` creates a new Model with its own set of node transforms and animators.
- The `model:meshes(node)` iterator will return `mesh, animator` pairs.

So to render a model manually, you could do the following:
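(A sketch; every method here is one of the proposals above, so treat it as pseudocode for the idea rather than a real API.)

```lua
-- Draw each node's meshes by hand; the Animator attached to a mesh carries the
-- joint poses / blend shape weights that Model:animate would normally drive.
local function drawModel(pass, model)
  for node in model:nodes() do
    for mesh, animator in model:meshes(node) do   -- proposed mesh, animator pairs
      mesh:setAnimator(animator)                  -- proposed, kinda like a material
      pass:draw(mesh)                             -- node transform handling omitted
    end
  end
end
```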
Which is pretty nice!
It also makes it possible for you to reuse LÖVR's animation machinery to give your own `Mesh` objects blend shapes or skeletal animations. I'm not sure if anyone would end up doing that though...

However, I'm not totally sold on `Animator`. I'm not sure if the extra complexity on `Mesh` would be worth it. Maybe it's easy to ignore the blend shape stuff for non-animated meshes and it would be fine.