Very Large File, Irregular Mesh Object (15.7 Mln Vertices)

Discussion in 'General Discussion' started by runcyclexcski, Jun 22, 2020.

  1. runcyclexcski
    runcyclexcski Active Member
    I've got a molecular model which, when exported to WRL and opened in MeshLab, shows 15.7 mln vertices and 5.2 mln faces. The WRL is 311 MB; the STL is 1.5 GB. I've printed smaller molecular models before, but this one is a rather large, irregular molecule. Zooming in, the mesh looks just about acceptable; I would not want to simplify it, as this is for student teaching. The target size I have in mind is about 4x4x4 inches.

    Any ideas? Slice it up into 3x3x3 (27 cubes), print the pieces separately, and glue? Is there a tool to optimize slice placement (i.e. to avoid orphaned, non-connected blobs)?
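    A rough sketch of the cube-splitting idea (hypothetical Python/NumPy, assuming the mesh is already loaded as vertex and face arrays; real slicing would also have to cut triangles that straddle cell walls, which this only buckets by centroid):

```python
import numpy as np

def assign_triangles_to_grid(vertices, faces, divisions=3):
    """Bucket each triangle into one cell of a divisions^3 grid by its
    centroid, so each cell could be exported and printed separately.
    Cells that receive no triangles simply never appear in the result."""
    tri_centroids = vertices[faces].mean(axis=1)            # (F, 3)
    lo, hi = vertices.min(axis=0), vertices.max(axis=0)
    # Map centroids to integer cell indices in [0, divisions-1]
    cell = np.floor((tri_centroids - lo) / (hi - lo) * divisions).astype(int)
    cell = np.clip(cell, 0, divisions - 1)
    # Flatten (i, j, k) into a single cell id
    cell_id = cell[:, 0] * divisions**2 + cell[:, 1] * divisions + cell[:, 2]
    return {cid: faces[cell_id == cid] for cid in np.unique(cell_id)}

# Toy example: a tetrahedron bucketed over a 2x2x2 grid
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
tris = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
buckets = assign_triangles_to_grid(verts, tris, divisions=2)
print(len(buckets), "non-empty cells")
```

    Checking each bucket for disconnected pieces would then flag the "orphan blob" cells before printing.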

    SW strong plastic, or the semi-transparent one.

    Cheers!

    P.S. I tried to post a screenshot of my MeshLab view, but there seems to be a bug in the image placement in Chrome. Maybe this is because MeshLab has eaten up all the memory.

    P.P.S. The input data for the surface are atomic coordinates, i.e. a bunch of spheres, dots, and lines in XYZ; that file is 'only' 30 MB. Not sure if this helps, but maybe SW printers can interpret non-mesh formats?
     
  2. Shea_Design
    Shea_Design Well-Known Member
    I looked at something like this a few years back, and the spheres were chewing up the poly budget. If you can reduce the mesh density of the spheres, you may be on your way to optimization. I was able to do this when I imported the SW user's WRL into 3ds Max, but got tired of the heavy scene and went back to work. The atomic coordinates are essentially a scene graph describing the geometry, which is far different from parsing the dataset into triangles. Your vertex-to-face ratio is pretty close to 3x, so it seems like MeshLab has done a decent job welding the verts. Have you tried decimating the model? There are some very good algorithms out there: curvature-based, max/min edge length, etc.
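    For reference, the vertex welding mentioned above can be sketched in NumPy (a toy sketch, not MeshLab's actual implementation), assuming triangles arrive STL-style with no vertex sharing:

```python
import numpy as np

def weld_vertices(triangles, decimals=6):
    """Merge vertices that coincide (within rounding) and re-index faces.
    `triangles` is (F, 3, 3): F triangles, each with 3 xyz corners,
    the way a raw STL stores them (every corner duplicated)."""
    flat = triangles.reshape(-1, 3)
    keys = np.round(flat, decimals)
    unique, inverse = np.unique(keys, axis=0, return_inverse=True)
    faces = inverse.reshape(-1, 3)      # (F, 3) indices into `unique`
    return unique, faces

# Two triangles sharing an edge: 6 stored corners weld down to 4 vertices
tris = np.array([
    [[0, 0, 0], [1, 0, 0], [0, 1, 0]],
    [[1, 0, 0], [1, 1, 0], [0, 1, 0]],
], float)
verts, faces = weld_vertices(tris)
print(len(verts), "unique vertices")
```

    A fully welded closed triangle mesh has roughly half as many vertices as faces, so the vertex-to-face ratio is a quick sanity check on how much welding has happened.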
     
  3. runcyclexcski
    runcyclexcski Active Member
    I did try decimating simpler models using quadric edge collapse and isotropic remeshing. I could get a 5-10x reduction, although the molecules started looking 'pointy' where the spheres used to be (as expected). For this one, even at a 10x reduction I would still be at 1.5 mln vertices. Thus, I was wondering if there is a more general approach (hence the idea of splitting into cubes).
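    Another general-purpose reduction is vertex clustering, sketched below in NumPy (a toy illustration, not any particular tool's filter): snap vertices to a grid, merge the ones landing in the same cell, and drop triangles that collapse. Coarser cells mean stronger reduction, and also the same 'pointy spheres' effect.

```python
import numpy as np

def cluster_decimate(vertices, faces, cell_size):
    """Crude clustering decimation: snap vertices to a cell_size grid,
    merge those landing in the same cell, drop degenerate triangles."""
    keys = np.floor(vertices / cell_size).astype(np.int64)
    _, first, inverse = np.unique(keys, axis=0,
                                  return_index=True, return_inverse=True)
    new_verts = vertices[first]          # one representative per cell
    new_faces = inverse[faces]
    # Drop triangles whose corners collapsed into the same cluster
    ok = ((new_faces[:, 0] != new_faces[:, 1]) &
          (new_faces[:, 1] != new_faces[:, 2]) &
          (new_faces[:, 0] != new_faces[:, 2]))
    return new_verts, new_faces[ok]

# Two tiny triangles far apart: a fine grid keeps both, a coarse grid
# collapses each triangle into a single point
verts = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0],
                  [2, 0, 0], [2.1, 0, 0], [2, 0.1, 0]])
faces = np.array([[0, 1, 2], [3, 4, 5]])
nv1, nf1 = cluster_decimate(verts, faces, cell_size=0.05)  # nothing merges
nv2, nf2 = cluster_decimate(verts, faces, cell_size=1.0)   # both collapse
```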

    There are large macromolecular models composed of many 'subunits' (50-100) which fit together neatly and are more or less independent. Those could be printed separately in a rubbery material and reassembled by hand (which in itself is an interesting project for students -- how proteins interact, etc.). But the one I am looking at right now is one long interwoven worm, with no obvious way to split it into independent parts other than brute slicing.
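    If the model does get brute-sliced, each block can be checked for orphan blobs by labeling connected components of its faces; a minimal union-find sketch over shared vertex indices (hypothetical helper, plain Python):

```python
def face_components(faces):
    """Union-find over shared vertices: returns one component label per
    face, so disconnected 'orphan blobs' inside a block stand out."""
    parent = {}

    def find(v):
        while parent.setdefault(v, v) != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b, c in faces:
        union(a, b)
        union(b, c)
    labels, comp = {}, []
    for a, _, _ in faces:
        comp.append(labels.setdefault(find(a), len(labels)))
    return comp

# Two triangles sharing vertex 2, plus one detached triangle: 2 components
faces = [(0, 1, 2), (2, 3, 4), (10, 11, 12)]
print(face_components(faces))   # [0, 0, 1]
```

    Any block whose faces fall into more than one component contains a piece that would print as a loose blob.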
     
  4. Shea_Design
    Shea_Design Well-Known Member
    Or brute Booleans with tongue-and-groove or keyways. ZBrush would make cake of it. -S
     
    runcyclexcski likes this.
  5. runcyclexcski
    runcyclexcski Active Member
    Well, I will need to learn what all of these options mean first! :) OK, looks like ZBrush imports STL; that's a start. Downloading the 1.2 GB ZBrush trial version, will try tomorrow! If it works, I may need to build a new PC for this.
     
    Last edited: Jun 23, 2020
  6. Shea_Design
    Shea_Design Well-Known Member
    The live Boolean system in ZB is amazing, so much so that you can add mortise-and-tenon (stick and hole) features to provide assembly joints. The decimation in ZB is great too.
     
  7. runcyclexcski
    runcyclexcski Active Member
    I downloaded it, installed it, and opened the OBJ surface file (1.5 GB) generated by my molecular modeller. It opened fine (whereas MeshLab died at this point), and the Z-file is pretty small (like 10x smaller). But usability-wise, ZB will take me a while to do even the most basic things. It is definitely not a piece of software in which I can just open a file, run a filter, zoom in to inspect the result, and re-save (MeshLab was like that, i.e. pretty intuitive). And I thought Autodesk Inventor was awkward :) But I see why artists use ZB.
     
  8. hanelyp
    hanelyp Well-Known Member
    What software is generating the original mesh? Are there mesh generation settings that might improve facet efficiency? A shot of what the raw mesh looks like, even on a simplified model, might help us figure out how to improve the mesh.
     
  9. runcyclexcski
    runcyclexcski Active Member
    Here is a relatively high-res STL file at 4 MB, at a mild decimation level (the original is 14 MB). This is a piece of DNA that is 50 units (base pairs) long. The structure in the OP is 8000 units (base pairs) long, i.e. the respective file and structure scale up 160x. The bumps etc. are important, and I would not want them converted into spikes (which is what aggressive decimation does). I imagine that if the bumps were approximated as spheres, Gaussians, or sine functions it would be acceptable, but I am not sure printers can interpret those?
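    A side note on file size: a binary STL stores a fixed 50 bytes per triangle plus an 84-byte header, so 5.2 mln faces should come to roughly 260 MB. A 1.5 GB STL for that face count suggests an ASCII export, and re-saving as binary STL alone would shrink it several-fold before any decimation:

```python
def binary_stl_bytes(n_faces):
    # 80-byte header + 4-byte face count + 50 bytes per triangle
    # (12 float32s for the normal and 3 vertices, plus a 2-byte attribute)
    return 80 + 4 + 50 * n_faces

faces = 5_200_000
print(binary_stl_bytes(faces) / 1e9)   # ~0.26 GB, vs the 1.5 GB reported
```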
     

    Attached Files:

  10. McFly
    McFly Member
    With ZBrush it would be no problem to bring the point/poly count down with minimal detail loss, nothing you would see in the 3D print.