How To 3D Print High Polygon Models with Shapeways

AN IMPORTANT UPDATE: As of April 2013, Shapeways no longer offers this high polygon workaround. If your model has over a million polygons, use this handy tutorial to reduce them so you can upload your model in the normal way. Thank you for your understanding!

OK ZBrush users, organic algorithm artists, and character modelers: Shapeways has developed a workaround so that your models can now exceed the 1,000,000 polygon limit currently in place.

It is not elegant and takes some negotiating, but if we get enough demand we will try to automate the process to make it easier for everyone. Let us know what you think: is this something you would like to see as standard?

Order Process for High Polygon Models:

  1. Send your zipped high polygon file to Shapeways at service(at)shapeways.com.
  2. Shapeways will manually check the file and inform you whether it can be printed.
  3. Once you have confirmation from Shapeways that it can be printed, upload a dummy file of a cube with the same volume as your high polygon model to your Shapeways account with the file name ‘POLYGONENFILE’, then order and pay for it.
  4. Email your order number to service(at)shapeways.com.
  5. Your high polygon model will then be 3D printed and delivered to your door.

We warned you it was not pretty, but if there is enough uptake we can automate and speed up the process. Until then, these manual steps mean an increase in the time it takes to process the model. Update: the volume of the cube needs to be the same actual material volume, not the bounding box. The price is the same as for a “low” poly model.
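Since the dummy cube must match the actual material volume (not the bounding box), one way to size it is to compute the mesh's enclosed volume with the signed-tetrahedron method and take the cube root. Below is a minimal Python sketch, assuming a watertight, consistently wound triangle mesh; the function names and sample data are illustrative only, not part of the Shapeways workflow:

```python
# Volume of a closed triangle mesh via signed tetrahedra: each
# triangle forms a tetrahedron with the origin; the signed volumes
# sum to the enclosed volume for a watertight, consistently wound mesh.

def mesh_volume(vertices, triangles):
    total = 0.0
    for i, j, k in triangles:
        ax, ay, az = vertices[i]
        bx, by, bz = vertices[j]
        cx, cy, cz = vertices[k]
        # scalar triple product a . (b x c), divided by 6
        total += (ax * (by * cz - bz * cy)
                  + ay * (bz * cx - bx * cz)
                  + az * (bx * cy - by * cx)) / 6.0
    return abs(total)

def dummy_cube_edge(volume):
    # Edge length of a cube with the same material volume.
    return volume ** (1.0 / 3.0)

# Example: a tetrahedron with unit legs along each axis has volume 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tris = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
print(mesh_volume(verts, tris))  # prints 0.16666666666666666
```

A cube with edge length `dummy_cube_edge(mesh_volume(verts, tris))` then has the same material volume as the model, which is what the dummy file needs.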

Once model resolution increases, Shapeways can work on increasing the resolution of its 3D printing processes to make the most of the high polygon models.

You might wonder why we have not fixed this problem already. Well, processing those big polycount files takes a lot of memory, and we process multiple files at the same time. We can improve our algorithms (rocket scientists are welcome to help) or add memory. We already have around 8GB of processing memory, so this is not that easy either.

Now let’s see your high polygon models….

15 comments

  1. Tommy Strömgren

    Hi!

    I haven’t “hit the ceiling” yet since my models are very easily optimized in MeshLab (that quadric edge collapse-thingy with planar simplification on) so I’m not in the target group for this proposed solution… but I just thought of an alternative idea…

    I’m guessing you have several servers clustered to do the automatic verification and my idea is based on that assumption.
    If that is the case, why not create a separate “pipeline” for high polycount files?

    If the polycount is below 500K, queue it for verification as normal.

    If the polycount is above 500K, put it in a separate “high polycount” queue with a dedicated server that runs fewer parallel tasks, e.g. 1 or 2 high polycount models in parallel.

    That way, you would only have to inform the uploading user that if the polycount is above the threshold it may take longer to verify since those files are handled in a separate queue.

    I always thought that it was the print job preparation stage that imposed these memory issues.
    Sure, the automated verification takes longer on a larger file, but I really thought the problems came from handling several objects at the same time (prepping a print job).
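The two-queue routing sketched in the comment above can be expressed as a simple dispatch on polycount; the threshold, queue names, and return strings below are hypothetical illustrations, not Shapeways' actual pipeline:

```python
from queue import Queue

HIGH_POLY_THRESHOLD = 500_000  # hypothetical cutoff from the comment above

normal_queue = Queue()      # verified by the regular, highly parallel workers
high_poly_queue = Queue()   # dedicated server running only 1-2 jobs at once

def enqueue_model(model_id, polycount):
    """Route an uploaded model to the appropriate verification queue."""
    if polycount > HIGH_POLY_THRESHOLD:
        high_poly_queue.put(model_id)
        return "high-poly queue"
    normal_queue.put(model_id)
    return "normal queue"

print(enqueue_model("dragon.stl", 2_400_000))  # prints "high-poly queue"
print(enqueue_model("ring.stl", 120_000))      # prints "normal queue"
```

The uploading user would then only need to be told that files above the threshold may take longer to verify.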

    1. Peter Weijmarshausen

      Hi Tommy,

      Good idea; we thought of it too, and we might still do it. However, in reducing polygons we might also damage the model, so we need to give detailed visual feedback to prevent disappointment. It is also more work than the current bare-bones process, so let’s start like this.

      Thanks for sharing!

      Peter

    2. csrdfr

      Build prep would be the main reason. Most ZBrush users can relate to multi-million triangle models slowing their system to a crawl. Now take that model, duplicate it 50 times, and try to use the program to array the models manually.

      It’s a goddamn nightmare.

  2. Duann

    Thanks Tommy,

    I will pass your comment on.

  3. chris bense

    This would be great! A lot of errors are caused by crunching down the polycount of the model; this would save so much time and so many headaches in the end.
    Questions that arise:

    Will the price be different?

    Does the cube need to be the same actual material volume, or will it need to be the outer bounds of the model?

    thanks

    1. Peter Weijmarshausen

      Hi Chris,

      the volume of the cube needs to be the same actual material volume, no bounding box.
      The price is the same as for a “low” poly model.

      Peter

    2. chris bense

      That is perfect, thanks for the answer.

  4. Lincoln Kamm

    I have many models that I would upload if we could go over the 500,000 polygon limit. The method mentioned above seems a bit too much work for right now, so if there were a more straightforward way to upload them, I’d be interested.

    1. Peter Weijmarshausen

      Hi Lincoln,

      what kind of models are they? And how many polygons would be sufficient for you to upload your files? 1M? 10M?

      Peter

  5. Glenn Slingsby

    Right now I don’t see the need to upload a large polygon model; given your current printing resolution, is there any point? (Yes, that’s a question, not a statement.) I use ZBrush and just decimate the polygons to make the file upload quicker… Certainly if you introduce a jeweller’s wax material then this would be a different story.

    Glenn

  6. Ben Calvert-Lee

    In my experience, anything in excess of 500,000 polygons is a waste. The printing technology is likely to be unable to capture details quite that fine. I frequently work with Zbrush files consisting of anything up to about 12 million polygons but have not had any trouble using Decimation Master to reduce this to under the 500K limit.
    I would worry that removing the cap would pave the way for lazy artists not optimising and therefore greatly increasing the processing time for a mesh when it is not necessary.
    If a super-high-detail material is introduced, I can see an argument for the cap being raised.

    1. Peter Weijmarshausen

      Ben, you are mostly right, but in some cases there are valid reasons to go beyond 500k polygons even with today’s resolution. Remember our post on Digi-Fabrics? Those had around 500k with a small model, so in that case you could definitely go beyond.
      Perhaps 12M is a bit rich, but it depends a little on the geometries.

      Peter

    2. Ben Calvert-Lee

      Having just checked the Digi-Fabrics post, I can see how the polys could start to add up. Perhaps polygon-density is the more relevant aspect?
      I should add that’s 12M spread across multiple sub-tools, not simultaneously active :)
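The polygon density raised in the comment above can be quantified as triangles per unit of surface area; a small Python sketch with made-up function names, assuming a triangle mesh given as vertex coordinates and index triples:

```python
import math

def triangle_area(a, b, c):
    # Half the magnitude of the cross product of two edge vectors.
    ux, uy, uz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
    vx, vy, vz = c[0] - a[0], c[1] - a[1], c[2] - a[2]
    cx, cy, cz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def polygon_density(vertices, triangles):
    """Triangles per unit of surface area (e.g. per square millimetre)."""
    area = sum(triangle_area(vertices[i], vertices[j], vertices[k])
               for i, j, k in triangles)
    return len(triangles) / area

# One right triangle with unit legs: area 0.5, so density 2.0.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 2)]
print(polygon_density(verts, tris))  # prints 2.0
```

By this measure, a small model such as the Digi-Fabrics piece can justify a high polycount even when a larger model with the same count could not.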

    3. Lukas

      Prints of big molecules need big numbers of triangles too,
      as does anything with a lot of highly curved surface.

  7. Reavenk

    I think what would also be way cool is if users could upload the cage mesh and it would print out the limit surface for you without users having to subdivide before uploading. Of course, STL doesn’t support quads, but if there was a way to do that, I see it helping a lot of people who use a subdivision workflow.

Comments are closed.