New 3D scanning web app…
This week we’re looking at a new 3D scanning web app that lets you digitise real objects for 3D print output on your Personal Factory here at Ponoko!
my3dscanner.com launched recently and is rapidly becoming popular. It allows anyone who owns a digital camera that records EXIF data to 3D scan real-life objects! (If you don’t know what EXIF data is, don’t worry – as long as your camera is newer than 1998, it probably does this.)
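If you’d like to check a photo programmatically, EXIF data lives in a JPEG’s APP1 segment. Here’s a small stdlib-only Python sketch that walks a JPEG’s markers looking for one – a quick heuristic of my own, not a full EXIF parser:

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Return True if the JPEG byte stream contains an EXIF APP1 segment."""
    if jpeg_bytes[:2] != b"\xff\xd8":  # must start with the SOI marker
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:      # lost sync with the marker stream
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:             # start-of-scan: no more metadata segments
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        # APP1 (0xE1) segments holding EXIF begin with the "Exif\0\0" signature
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length                # skip marker bytes plus segment payload
    return False
```

Run it over `open("photo.jpg", "rb").read()` to see whether a given shot carries EXIF at all.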
How to scan
This article covers only the actual scanning process; converting the generated point cloud to a mesh is a subject for another article. (In the meantime, I suggest you check out some of the tutorials available online – MeshLab has several.)
Scanning often requires a tidy-up post-scan, so you’re best avoiding geometric primitives that could be modelled more efficiently by other means.
Secondly, your subject’s surface finish needs consideration – even $10,000+ 3D scanners struggle with highly reflective surfaces such as metal, glass and high-gloss plastics.
Tones should ideally be neutral. White or very light surfaces reflect a lot of light, and surface information can be lost; the same goes for very dark, low-reflectance surfaces that don’t reflect enough light.
This doesn’t mean your mirror-polished titanium Guggenheim replica can’t be scanned, however. One technique is to coat the surface with a light dusting of grey paint primer, which you can remove afterwards with thinners. (For objects with a plastic or painted surface, I’d recommend chalk dust instead, and avoiding thinners.)
My first test subject was a human anatomical reference model, owned by a workmate of mine. It had plenty of detail, although I was worried the specularity (highlights) on the plastic model might be slightly detrimental to the quality.
It is really simple – you don’t even need a tripod: my3dscanner uses surface relief, textures and background information for 3D spatial reference and point cloud generation.
- For one 180-degree side, aim to take 25-35 overlapping shots (think of it like making a panoramic image). Get lots of angles while keeping a deep depth of field: both subject and background should be sharply in focus.
- Make sure your subject is absolutely still; 3D scanning people this way is very, very difficult.
- Download the JPEG images from your camera and compress them into a single .zip file.
- Register at www.my3dscanner.com/
- Upload the images and be prepared to wait several days for the server to process your shots.
- You’ll be notified by email when your 3D point cloud is ready; download it for safekeeping.
- To view and clean up your model, MeshLab is a good open-source 3D processing utility.
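The download-and-zip step above is easy to automate. Here’s a minimal Python sketch (the function name and flat archive layout are my own choices – my3dscanner just needs a .zip of your JPEGs):

```python
import zipfile
from pathlib import Path

def zip_scan_photos(photo_dir: str, out_zip: str) -> int:
    """Bundle every JPEG in photo_dir into out_zip; returns the photo count."""
    photos = sorted(p for p in Path(photo_dir).iterdir()
                    if p.suffix.lower() in (".jpg", ".jpeg"))
    # JPEGs are already compressed, so store them rather than deflating again
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_STORED) as zf:
        for p in photos:
            zf.write(p, arcname=p.name)  # flat archive, no folder nesting
    return len(photos)
```

Using `ZIP_STORED` keeps zipping fast; re-compressing JPEG data saves almost nothing.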
Ideal setup
3-5 megapixels. (My test was shot at 11MP, although according to the website it’s not necessary to shoot above 5MP. In fact, you’ll have a long wait if you choose to shoot with a high pixel count.)
50-125 ISO. I shot at 400 ISO, and I believe this resulted in some loss of detail. You want to minimise sensor noise that may interfere with the texture, so keep the ISO right down.
Any lens will do, but wide-angle lenses are better suited to my3dscanner.com; anything between 20 and 45mm focal length (or equivalent) should suffice. If you have a compact digital camera, zoom out to the widest lens position.
Make sure your subject is evenly lit. Too many highlights or too much shadow may result in holes. If you’re shooting outside, avoid the high contrast of the midday sun – buildings and sculptures are best shot at midday when it is overcast.
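If your camera records EXIF, you can sanity-check a batch against the recommendations above before uploading. A rough Python sketch, with thresholds taken from this section (the function itself is just my own illustration):

```python
def check_shot_settings(iso: int, focal_length_mm: float, megapixels: float):
    """Return a list of warnings for settings outside the suggested ranges."""
    warnings = []
    if not 50 <= iso <= 125:
        warnings.append("ISO outside 50-125: expect more sensor noise")
    if not 20 <= focal_length_mm <= 45:
        warnings.append("focal length outside 20-45mm equivalent")
    if megapixels > 5:
        warnings.append("above 5MP: processing will take much longer")
    return warnings
```

My own test shots (ISO 400, 11MP) would have tripped two of these three warnings, which matches the loss of detail and long processing wait I saw.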
The raw point cloud, prior to cleanup.
What I learned from my results
Their servers run 24/7 and are very busy; don’t be surprised to have to wait several days to get your point cloud back. It’s worth it, and you can’t really complain – the service is free!
Don’t expect your first model to be amazing. My 3D scanned anatomy model is full of small holes. Even so, I was surprised it came out as well as it did. The app seems to require textured surfaces to locate the object in 3D space. I would recommend starting with just one side of your subject; about 30-40 photos covering 180 degrees should be enough to see some good results.
For example, in my second model – a small indoor plant – the plain glossy ceramic pot didn’t appear in the point cloud at all.
Finally, take extra care with your lighting: your subject should be well lit with diffuse, even illumination. And keep your camera’s ISO speed low, or sensor noise may interfere with the subject’s texture.
With some practice and controlled lighting conditions, I believe it is definitely possible to achieve excellent results with this web app. What are you waiting for? Go shoot, and let’s see some cool models in the Ponoko showroom!