The very concept of UV unwrapping disgusts me. You're telling me that the topology I meticulously worked on now has to be split and splayed out on a 2D plane, where it only kinda-sorta retains its scaling? Now I'm looking at a flayed human body, and I have to somehow paint its warped shape in a way that avoids unsightly seams and screen-space edges. Then, if I realize I need to change the model's topology or the position of the UVs, everything becomes unaligned. So basically, the mesh has to be set in stone before I even begin working on UVs.

Why has nobody come up with a better solution already? No 3D texture space? No way to color vertices at a conceptually higher density than the actual mesh? That would probably be easiest. Just treat it like subdivision, but don't actually subdivide the mesh. Only increase the amount of imaginary points one is capable of coloring.

The thing that really annoys me is that UV unwrapping seems to be the best that we have.
>>1027767
There are plenty of other methods that are simpler for authoring, but as far as texture memory use and access go, none of them are more efficient for polygonal models than UV unwrapping. Doubly so given that every GPU produced in the last couple of decades is hyper-optimized for that use case. So sooner or later you'll have to deal with it.
>>1027769
Maybe I can texture every face individually. Has anyone ever tried that? No, not unwrapping the mesh into squares. Save me THAT talk. Though, what I'm suggesting is similar, except that it eliminates the need to actually unwrap. Rather, it just maps all the faces to the texture space and creates an atlas of sorts to keep them separated and organized. And then you could even set the density of every face in order to save memory, because larger faces would need higher density than smaller faces. Hmmm... Maybe this is possible with geometry nodes?
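The per-face atlas idea above can be sketched in plain Python (not geometry nodes). This is a toy shelf-packer with made-up names: each face gets its own square cell in one big texture, sized by its surface area, with padding between cells.

```python
import math

def face_atlas_rects(face_areas, texels_per_unit_area, atlas_size=4096, padding=2):
    """Toy per-face atlas: give every face its own square cell in a single
    texture, sized by the face's surface area (bigger face -> more texels).
    Hypothetical sketch, not a real geometry-nodes or Ptex API."""
    rects = []
    x = y = 0
    row_h = 0  # height of the current packing shelf
    for i, area in enumerate(face_areas):
        # side length in texels, proportional to sqrt(area)
        side = max(1, math.ceil(math.sqrt(area * texels_per_unit_area)))
        if x + side + padding > atlas_size:  # start a new shelf
            x, y = 0, y + row_h + padding
            row_h = 0
        if y + side > atlas_size:
            raise ValueError("atlas full; need a bigger texture")
        rects.append((i, x, y, side))        # face index -> pixel rect
        x += side + padding
        row_h = max(row_h, side)
    return rects

# Two faces: the 4x-larger face gets a cell with 2x the side length.
print(face_atlas_rects([1.0, 4.0], texels_per_unit_area=16))
```

A shelf packer wastes some space compared to fancier bin-packing, but it keeps the cell lookup trivial, which matters if a shader has to find each face's rect.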
>>1027772
Disney did it, they called it PTEX. It's only suitable for offline rendering, as you can't mipmap a jumble of discontinuous faces.
>>1027767
You assign the different materials to the different colors you want. Then you can do a smart UV unwrap with a slight separation. You bake the materials into a single one.

TLDR: learn to bake different materials into one.
>>1027777
Sorry for the beginning, I meant that you select the faces and assign a material to them. You can have different materials assigned to different faces with different colors.

There's also marking the seams. That's the one I haven't tried as much, so I know what it does, but not how to do it efficiently.
>>1027772
As >>1027774 says, PTEX is exactly that, but it assumes 4-sided faces (because a 2D texture has to be a rectangle), so if you've got triangles you're out. The PTEX answer to that is literally "the Disney pipeline is all quads anyway, or subdiv everything at least one step to get all quads", but that flies for offline rendering, not in real time, where you can't afford to just triple or quadruple your poly count and also have to avoid very small faces for better perf.
>>1027767
I understand your sentiment, but it is what it is. Protip: don't even try to get around the issue. The best and shortest way to deal with unwrapping is to learn how to do it properly.
>>1027786
>>1027774
I made a rough concept. Using geometry nodes, I packed all the faces of this model into the UV space as squares. Then, using a single 4K texture, I'm capable of drawing on the mesh at higher density. Demonstrated in the video is 16 pixels per face. I can adjust it higher or lower.

But the major problem is that the brush bleeds into other faces, seemingly at random. There's some kind of pattern to it, though I can't figure it out. I tried scaling each face down individually, giving them a buffer zone between each other, but that doesn't solve the problem. It's unusable in its current sketchy form. But maybe it can be made to work somehow.
>>1027801
For the brush not to bleed, those need to all be individual textures or have a safety margin between each. And it only looks plausible because you're on point filtering; when you switch to bilinear it'll become apparent that your method doesn't filter across faces, as was mentioned by >>1027774. Also, due to mipmapping, everything's gonna start to look weird when the texture is minified.
>>1027767
Maybe there's a yet-undiscovered vertex color algorithm that has you store not one but multiple colors per vertex, then uses those to provide more coloring resolution for imaginary vertices, as if you had used subdiv.
>>1027815
>For the brush to not bleed those need to all be individual textures
I would like to try that. However, I think that's outside of what geometry nodes are capable of, because there's no way of creating a mass batch of textures. Perhaps there is some Python solution for that task, but I don't know Python. So instead, I'm opting to pack all the faces into a space where one big single texture can be used.

>or have a safety margin between each
That's what I said I did before. But going back over things, I see where I made a small mistake and just fixed the scaler. Now it doesn't bleed anymore.

>when you switch to bilinear it'll become apparent your method doesn't filter across faces
Sure, but that's not a problem when painting on the model. When you paint on the model itself, the brush is already doing all the blending across faces; you won't be able to tell that faces are disconnected. This would only be a problem for those painting on the 2D texture.

If, for whatever reason, the user wanted to apply some kind of blending filter to the 2D texture, it's still possible: because every face is a square, you can easily find the coordinates of neighboring faces. In 3D space, every face has a clockwise and counterclockwise orientation. If you choose a corner at random and then move along the edge to the next corner, that defines a direction. We can call the first clockwise move "north", and from there we can define east, south, and west. Now we can capture the index of neighboring faces by following the north, east, south, and west coordinates.

I don't really know what mipmaps are. But since textures are a power of 2, can't you just create lower-resolution versions by going down a power? I don't see why that would be a challenge.
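The north/east/south/west neighbor idea described above amounts to a directed-edge lookup: two quads are neighbors across an edge exactly when one face walks that edge forward and the other walks it backward. A toy sketch, assuming quads given as consistently wound 4-tuples of vertex indices (function name is made up):

```python
def face_neighbors(quads):
    """For each quad (4 vertex indices, consistently wound), find the face
    across each of its edges. The four results per face correspond to the
    "north, east, south, west" directions defined by walking the corners
    in winding order from the first corner. None means a boundary edge."""
    edge_to_face = {}
    for f, quad in enumerate(quads):
        for k in range(4):
            a, b = quad[k], quad[(k + 1) % 4]
            edge_to_face[(a, b)] = f          # directed edge -> owning face
    neighbors = []
    for f, quad in enumerate(quads):
        dirs = []
        for k in range(4):
            a, b = quad[k], quad[(k + 1) % 4]
            # the neighbor walks the shared edge in the opposite direction
            dirs.append(edge_to_face.get((b, a)))
        neighbors.append(tuple(dirs))          # (N, E, S, W) for this face
    return neighbors

# Two quads sharing the edge between vertices 1 and 2:
print(face_neighbors([(0, 1, 2, 3), (1, 4, 5, 2)]))
```

With that table, a blending pass over the 2D atlas could fetch each face's square plus the border rows of its four neighbors, which is essentially the adjacency data PTEX stores per face.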
>>1027817That's an interesting idea.
>>1027842
>for whatever reason, the user wanted to apply some kind of blending filter to the 2D texture
When we're talking about filtering, we're not talking about applying filters but about the sampling process that makes textures continuous in the eyes of the GPU, which is a basic requirement of texture systems if you wanna use anything other than point sampling.

For example, for bilinear, that safety margin needs to exist across all mipmaps. So to be 1px at mipmap level 3, it needs to have been 2px at level 2, 4px at level 1, and 8px at level 0. That means each mipmap is no longer simply a half-size downsample of the preceding one in each dimension, and you can't use hardware mipmapping; you have to write a custom solution.

Also, this is all ignoring the elephant in the room, which is that in such a scheme all the faces have the same level of texture detail, which is a waste of memory for smaller faces. Again, what you're trying to do is PTEX, and it was already done better by Disney Research, and then everyone switched to UDIM, which does use UV maps, because UV maps allow for much better detail control.
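The margin math above compounds down the mip chain because each level halves both dimensions. A tiny sketch (helper name is made up) that lists the padding each level must reserve so a given margin survives at the smallest level:

```python
def margins_per_level(margin_at_smallest, num_levels):
    """Safety margin (in pixels) needed at each mip level so that
    `margin_at_smallest` pixels remain at the smallest level.
    Level 0 is full resolution; each step down halves the dimensions,
    so the margin must double once per level going back toward level 0."""
    return [margin_at_smallest * 2 ** (num_levels - 1 - level)
            for level in range(num_levels)]

# Keep a 1px margin alive through 4 mip levels (0 = full res):
print(margins_per_level(1, 4))
```

This is why per-face padding gets expensive fast: every extra mip level you want to keep filterable doubles the dead border around every face at the base resolution.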