OpenAI’s latest endeavor, Shap-E, is a model that allows you to generate 3D objects from text, not unlike how Dall-E can create 2D images.

According to OpenAI, Shap-E is “a conditional generative model for 3D assets. Unlike recent work on 3D generative models which produce a single output representation, Shap-E directly generates the parameters of implicit functions that can be rendered as both textured meshes and neural radiance fields.”

The company’s GitHub posting goes on to explain that Shap-E is trained in two stages: first an encoder that maps 3D assets into the parameters of an implicit function, then a conditional diffusion model trained on the encoder’s outputs.

However, this free-to-run program is a little more challenging to install and set up than the company’s ever-popular ChatGPT, as Tom’s Hardware explored.

You can download the Shap-E model from GitHub at no charge, and its outputs can be opened in tools such as Microsoft Paint 3D. They can also be converted into STL files, which allows the renders you create to be brought to life via 3D printers.
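To give a sense of what the PLY-to-STL conversion involves, here is a minimal pure-Python sketch. It parses a tiny hand-written ASCII PLY mesh (a single made-up triangle, used here only as an illustration) and emits an ASCII STL; a real workflow would typically use a mesh tool or library rather than hand-rolled code like this.

```python
def parse_ascii_ply(text):
    """Parse a minimal ASCII PLY: vertex x y z lines, then face index lists."""
    lines = iter(text.strip().splitlines())
    n_verts = n_faces = 0
    for line in lines:
        if line.startswith("element vertex"):
            n_verts = int(line.split()[-1])
        elif line.startswith("element face"):
            n_faces = int(line.split()[-1])
        elif line.strip() == "end_header":
            break
    verts = [tuple(map(float, next(lines).split()[:3])) for _ in range(n_verts)]
    # Each face line starts with a vertex count, followed by vertex indices.
    faces = [list(map(int, next(lines).split()))[1:] for _ in range(n_faces)]
    return verts, faces

def to_ascii_stl(verts, faces, name="mesh"):
    """Emit an ASCII STL; normals are written as 0 0 0, which slicers recompute."""
    out = [f"solid {name}"]
    for face in faces:
        out.append("  facet normal 0 0 0")
        out.append("    outer loop")
        for i in face[:3]:  # STL facets are triangles only
            x, y, z = verts[i]
            out.append(f"      vertex {x} {y} {z}")
        out.append("    endloop")
        out.append("  endfacet")
    out.append(f"endsolid {name}")
    return "\n".join(out)

# A toy one-triangle PLY, purely for demonstration.
ply = """ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
element face 1
property list uchar int vertex_indices
end_header
0 0 0
1 0 0
0 1 0
3 0 1 2
"""
verts, faces = parse_ascii_ply(ply)
stl = to_ascii_stl(verts, faces)
```

Once saved with an `.stl` extension, a file like this is what slicer software for 3D printers expects as input.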

While the basics of the Shap-E model might seem simple enough, some tech savviness is required to get it installed and running.

The publication’s editor-in-chief, Avram Piltch, tested Shap-E and says it took him eight hours to wrap his head around. He added that OpenAI offers little in the way of instructions beyond noting that you install it with a Python pip command.
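For reference, the pip-based installation the repository describes amounts to cloning the project and doing an editable install; a sketch, assuming you already have Python 3 and a working PyTorch setup:

```shell
# Clone the Shap-E repository and install it as a local Python package.
git clone https://github.com/openai/shap-e.git
cd shap-e
pip install -e .
```

The `-e` (editable) install means the package runs directly from the cloned source tree, which is convenient for running the example notebooks the repo ships with.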

Once installed, Piltch says he was able to render prompts as color animated GIF files and as monochrome PLY files, noting that the animated GIFs came out looking better.

Some prompts included a shark, a Minecraft creeper, and “an airplane that looks like a banana,” all of which had varying levels of quality depending on their file type. Piltch also tried the model’s image-to-3D function, which converts an uploaded 2D image into a 3D object.

The editor noted that those attempting to install Shap-E and render 3D objects should keep in mind that the model requires a lot of system resources from a PC.

In particular, Shap-E is compatible only with Nvidia GPUs and requires high-performance CPUs to render in a matter of minutes as opposed to hours.
