Leviathan Flux

Error: Vertex number is more than 65534, aborted! when model actually has far fewer verts (Maya 2014)


I am stumped with this one.  All of a sudden, I have been unable to upload a number of models and get the message "Error: Vertex number is more than 65534, Aborted!", but the vert count is actually less than half of that.  I can upload some other models with more verts, AND I was able to upload the very models that are giving me errors now just a few days ago, and I can still upload different sizes of the same model (same vert count) that I exported from Maya before.  Anyone else experiencing this?

I am using Maya 2014, exporting directly as collada (dae) and uploading with the standard SL viewer.  I also tried with Firestorm and was unable to upload as it just froze.

I know there are some factors that can increase the vert count in SL compared to how it's represented in Maya, but these models uploaded fine before, so I don't see what could be different.  I even tried loading an older version from a backup saved two weeks ago, when it was working, and had the same problem if I export and upload it now.  I also tried reinstalling SL and reinstalling Maya.


I didn't answer this because I don't know where this error message gets triggered. In the code there is supposed to be a limit of 65536 vertices, but the trap for that should never trigger, because another limit, on the triangle count per material, always gets triggered first, as far as I know. The latter error doesn't stop the upload. Instead it just silently starts new materials. I just tried uploading a mesh with more than 100,000 vertices with the current LL release viewer and there was no problem. So I suppose it has to be something specific to your mesh.

You say you tried uploading old versions of the mesh that worked before, but did you avoid the slm file mechanism? This is the system that keeps the settings from the last time a file with the same name was used. If something happened with those settings and the new file, it would then affect the old file too. You can stop the slm thing by (a) deleting the .slm file from the directory where your mesh file is, (b) clicking the reset button on the uploader dialog, or (c) permanently, by disabling the relevant debug setting (I think it's MeshImportUseSLM). I have it turned off permanently because it can cause several different problems.
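Option (a) can be scripted if you upload often. This is a minimal sketch of my own (the helper name and the assumption that the .slm file sits next to the mesh file with the same base name are mine, not from any viewer documentation):

```python
from pathlib import Path

def delete_stale_slm(mesh_path):
    """Remove the .slm settings file kept next to a mesh
    file with the same base name, if one exists."""
    slm = Path(mesh_path).with_suffix(".slm")
    if slm.exists():
        slm.unlink()  # force the uploader to start from fresh settings
        return True
    return False
```

Run it on the .dae you are about to upload; if it returns True, a stale settings file was removed.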

If that doesn't change anything, can you say exactly when in the sequence of actions the error message appears? Also, what you did with the physics tab (if anything) and the LOD slots, and whether the mesh was rigged (in which case I won't be able to help unless it also happens when uploaded as a static object). Also, what were the vertex and triangle counts displayed in the uploader (if it got that far)?


Thanks for the response, Drongle.  I am pretty sure it's not the slm, as I have tried resaving with different names in different folders, and I deactivated MeshImportUseSLM as well and still have the same issue.

The model is 36777 verts and 73234 tris (I know, very high).

I had uploaded it while rigged a week ago, and tried to re-upload a few days ago after making a few adjustments to the skin weights and got the error.  I then tried to upload it as a static object, with the same problem.  The error message pops up immediately, so I cannot make changes to the LODs, physics, or rigging, or try to calculate the upload fee, or see exactly how many verts SL calculates it has.  If I delete some of the faces on the mesh until it's under about 25k verts, I can upload it and SL gives an accurate vert count (it adds about 1k to the count).

I tried reinstalling Maya and SL, and tried installing the OpenCollada plugin, but get the same result.  I was also able to import the model into Blender, export from there, and upload to SL, so it seems to be something going on in Maya, which is odd because I can export and upload other models.  I reset the export settings in case I had accidentally ticked something, but nothing seems to be working.

This is making my brain hurt :(


If you mean you did ...

Maya->Collada->SL ERROR

Maya->Collada->Blender->Collada->SL OK

That is strange indeed. The files will be far too big to look for anything by hand. The error message you see seems to come from an XML file, and I don't know how that gets displayed, so I can't look for where it's triggered in the code. There are error messages embedded in the code that are easier to locate. I think they (or at least some of them) come out in the log file if you have the right logging option. Before you log in (LL viewer) you can set that in the menu Debug->Set Logging Level->Error. I think that might be enough. Then if you log in, immediately start the upload, and log out as soon as the error appears, you might find an error message in the log file which might tell us a bit more. In particular, if it comes from the code, it might be possible to see where it's triggered at the lowest level. Of course, that might not reveal what the problem is in the collada file.

When you deleted some of the faces, do we know whether it was just the number left that mattered, or whether something nasty got deleted by chance? Does it still work if you delete a completely different set of faces?


Ah. I might have seen how this can happen. The code looks at each polygon and triangulates it before checking the numbers: first the number of vertices (>= 65535 gives the error), then the number of indices in the triangle table. It does these checks after each polygon. There are always three indices per triangle, and the check is whether it is >= 65532. There can never be more vertices than indices in the triangle table. So as long as the input is all triangles, that is to say all polygons have only three vertices, the first check can never fail before the second one. The second check does not produce an error. Instead it starts a new material (prim face in SL), which resets the counts to zero. So triangulated meshes with more than 65535 vertices can always be uploaded. However, if there are ngons (possibly even just quads), then a polygon loads more than one triangle between each series of checks. Then the number of vertices can exceed 65534 before the checks are run. In that case the first check will produce the error. Generally, this will only happen if the mesh is flat shaded without any coplanar adjacent faces (and/or the UV map is totally fragmented*), because otherwise many of the indices in the triangle table will point to already stored vertices and the vertex count will increase more slowly than the triangle index count.
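If I have read the logic right, it can be sketched as a toy model. The constants come from the description above; the assumption that a flat-shaded polygon with fragmented UVs contributes three brand-new vertices per triangle is mine, not taken from the viewer source:

```python
# Toy model of the uploader's per-polygon limit checks, as described above.
# Assumes worst-case flat shading: no vertex is shared between triangles.

VERTEX_LIMIT = 65535   # ">= 65535 gives the error"
INDEX_LIMIT = 65532    # ">= 65532 silently starts a new material"

def simulate_upload(polygons):
    """polygons: vertex count of each input polygon (3 = triangle).
    Returns "ok" or "vertex-error"."""
    verts = indices = 0
    for n in polygons:
        tris = n - 2              # fan triangulation of an n-gon
        verts += 3 * tris         # worst case: no shared vertices
        indices += 3 * tris
        # Checks run only after the whole polygon has been loaded:
        if verts >= VERTEX_LIMIT:
            return "vertex-error"
        if indices >= INDEX_LIMIT:
            verts = indices = 0   # new material resets the counts
    return "ok"

# 30,000 triangles (90,000 unique vertices): the index check always
# trips first and quietly resets the counts, so the upload succeeds.
print(simulate_upload([3] * 30000))   # ok
# The same idea as 3,000 twelve-vertex ngons (10 tris each): one polygon
# pushes the vertex count past the limit before the index check can reset.
print(simulate_upload([12] * 3000))   # vertex-error
```

With triangles the counts climb in steps of 3 and hit the index limit (a multiple of 3) exactly, so the reset always wins; with big ngons the counts jump in steps of 30 and can leap over both thresholds at once, and the vertex check runs first.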

So can this explain why going via Blender allows the mesh to be uploaded? Yes! Firstly, the default setting (for the SL upload preset) is for the exporter to triangulate the mesh. If that's active, we are back to triangles only, and the first check should never be triggered. Even if it isn't active, so there are still ngons and/or quads, the order of polys will be different. That could change things so that the second check is triggered exactly at the end of a polygon, allowing the upload to proceed.

This presents a possible solution: just triangulate the mesh before you export it from Maya. If I am right, it will then upload. It's better to triangulate before uploading anyway, because that gives you more control than leaving it to the uploader. The details of the triangulation can affect shading if polys are non-planar, and it's going to get triangulated by the uploader regardless, because SL only uses triangles. Just make sure you can undo the triangulation after exporting, or save an untriangulated copy, because editing triangulated meshes is horrible.

Incidentally, this thing with the vertex count limit being pre-empted by the triangle index count check is the subject of a very old bug report in the jira (BUG-1001). Maybe I need to add to it with this new facet of unexpected behaviour. Let me know if the triangulation works.

*Note - this can also happen if there is no UV map at all, because the uploader then uses uninitialised data for the UV map!


I tried cleaning up the model and triangulating it, tried different dae export options, tried exporting and re-importing as obj, tried every possibility I could for days, and I tried uploading different sections of the model without any success in identifying a piece that had an issue.  With the help of a friend, we were able to determine that I could upload the same model I was having trouble with on Singularity.  The most logical explanation to me right now is that it must have been some sort of update or glitch with the SL viewer, because even the models that were working a week ago are not working on it now.  There did also happen to be something about this particular model that was not working with the SL viewer, and I'm still not sure what that was.  Thanks for the help.  I am not sure if there is a solution to this, but maybe we will see it occur with others and will know for sure if it's a new glitch with the SL viewer.


Is that the vertex number reported by Maya or Second Life? I'm not 100% sure about Maya, but in Max the actual exported vertex count isn't the same as the one displayed within Max: vertices along hard edges get split when you export. Unless you have everything in a single smoothing group, you are likely to see an increase of real vertices in SL versus what the modelling software shows.
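To make the splitting concrete, here is a toy count (my own illustration, not any exporter's actual algorithm): mesh file formats store one normal per vertex, so a vertex on a hard edge has to be written once per distinct face normal that meets it. A cube corner where three mutually hard faces meet becomes three exported vertices.

```python
def exported_vertex_count(corner_valences, hard):
    """corner_valences: number of faces meeting at each corner.
    hard=True: every edge is hard, so each corner is exported once
    per incident face normal. hard=False: one smoothing group, so
    each corner is exported once with a single averaged normal."""
    if hard:
        return sum(corner_valences)
    return len(corner_valences)

cube = [3] * 8  # 8 corners, 3 faces meeting at each
print(exported_vertex_count(cube, hard=False))  # 8
print(exported_vertex_count(cube, hard=True))   # 24
```

So a cube that the modelling app reports as 8 vertices can legitimately arrive in SL as 24.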


Oh well, not that then. I did test my theory, though:

Made a mesh with stacks of 12-vertex ngon disks (10 tris each), displaced to make the flat shaded normals different for adjacent triangles. Triangulated, it uploaded fine with 65550 vertices. Without triangulating, it failed, as predicted. It never got as far as showing a vertex count, as with your mesh, but the error message wasn't the one you got. It just gave the red cross and "Missing level of detail". Taking off two of the disks, dropping the vertex count to 65530, it uploaded fine without triangulation, although I had to use generated normals and set the crease angle to 0 to see the full vertex count; otherwise it treats the triangulated ngons as smooth shaded, I guess. That last observation raises the possibility that my explanation isn't completely accurate, but the main prediction is right: with untriangulated ngons it is possible to be hit by the 65534 vertex limit, and triangulation before upload can overcome it. I guess there must still be something else in your case if the triangulation doesn't work. Since the error message I got was different, I guess that also says it's something else.
