By paul ~ March 23rd, 2009. Filed under: Resources.
For those of you who, like me, are just getting started developing with OpenGL-ES for the iPhone, one of the challenges is building some meshes to work with. If you don’t have lots of money to spend on the real "pro" tools, you’ll want to check out Blender (http://www.blender.org). Blender is a feature-rich, but somewhat hard to learn, open-source 3D modeling and animation environment. It even has integrated physics simulation and scripting. To get up to speed on Blender, I highly recommend the Wikibooks tutorial Blender 3D: Noob to Pro. Once you face the Blender GUI for the first time, you’ll be very thankful for this grand effort at a tutorial!
Before I go on, I need to state that I’m a real noob myself and my needs at this point only include the creation of some meshes that I’m going to import into my application and then manage entirely there. I haven’t yet had a need or chance to try out boned meshes or any of the animation features. That being said, here are a few guidelines for developing your meshes in Blender with the intention of importing them into an Oolong (PowerVR) application as POD files.
- Go ahead and create your meshes in Blender, but recognize that rendering is going to be performed in your application, so don’t spend a lot of time trying to get it to look "just right" in the Blender renderer.
- You can set up initial light and camera locations in Blender. These are imported in the POD file.
- Create and apply your materials in Blender. Materials will be imported in the POD file (although I can’t find the Emission component yet.)
- Note that meshes with different materials on different faces will be split up into multiple meshes (under a single parent node.) For instance, a cube with a different material on each face will be split up into 6 different meshes, one for each face. Each of these "child" meshes will have a pointer to the parent node. More on how to deal with this in Oolong later.
- Right now, I’m applying textures to textured surfaces as place-holders only. This ensures that the UV coordinates are included in the interleaved vertex data. In Oolong, textures are imported from PVR files, and I haven’t yet figured out a path from Blender texture to PVR. If you can help out with this, please chime in! The result is that you will have to figure out the right texture settings through experimentation.
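To make that parent/child splitting concrete, here’s a minimal sketch of the arrangement. The `Mat4`, `Node`, and `WorldMatrix` names are my own stand-ins, not the actual Oolong/PVR types; the real SPODNode keeps an index back to its parent, and the idea is the same: a child node’s world transform is its local transform pre-multiplied by each ancestor’s.

```cpp
#include <cstring>

// A 4x4 column-major matrix, as OpenGL ES expects.
struct Mat4 { float m[16]; };

Mat4 Identity() {
    Mat4 r; std::memset(r.m, 0, sizeof r.m);
    r.m[0] = r.m[5] = r.m[10] = r.m[15] = 1.0f;
    return r;
}

Mat4 Translation(float x, float y, float z) {
    Mat4 r = Identity();
    r.m[12] = x; r.m[13] = y; r.m[14] = z;
    return r;
}

// Column-major product a*b.
Mat4 Multiply(const Mat4 &a, const Mat4 &b) {
    Mat4 r;
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            float s = 0.0f;
            for (int k = 0; k < 4; ++k)
                s += a.m[k * 4 + row] * b.m[col * 4 + k];
            r.m[col * 4 + row] = s;
        }
    return r;
}

// A node with an index to its parent (-1 = no parent), echoing the
// POD layout where each "child" mesh node points back at the parent.
struct Node {
    int parent;
    Mat4 local;
};

// Walk up the parent chain, pre-multiplying each ancestor's transform,
// so a per-face child mesh ends up positioned by the parent cube.
Mat4 WorldMatrix(const Node *nodes, int idx) {
    Mat4 w = nodes[idx].local;
    for (int p = nodes[idx].parent; p >= 0; p = nodes[p].parent)
        w = Multiply(nodes[p].local, w);
    return w;
}
```

So a six-material cube becomes one parent node plus six children; move the parent and all six faces follow.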
Once you have your meshes built, you’ll want to do the export/import dance to get to POD. Right now, it appears that the best path is from Blender to Collada 1.4 (via the Blender Export menu), then from Collada to POD via the PowerVR Collada2POD tool. (Collada2POD is in the utilities folder of the PowerVR SDK distribution.) Sorry, I couldn’t find a Mac OS X version of the SDK, so I have to use a Windows PC for this step. Here are the settings I’ve come up with for the export and import.
Export to Collada 1.4 Settings
- Set the geometry type to triangles.
- Don’t set anything else. I tried messing with the "UV Image Materials" but didn’t find that useful.
- Under geometry options, select the following (not all of these may be required; I didn’t experiment with dropping any out):
- Vertex Colors
- Mapping Channels
- Flip V
- Interleave Vectors
- PowerVR documentation recommends sorted, indexed, triangle strips for performance, so I turned on
- Sort vertices with PVRTTriStrip
- Indexed Triangle List
- I haven’t selected anything special in the Vertex vector formats.
- The resulting POD file can be automatically opened in PVR Shaman for examination by simply setting up the path on the Post-Export tab.
I used the Shadowing example as the template for how to load POD and PVR texture files. Here are a few notes on what I’ve discovered:
- A "SPODNode" is created for each instance of a mesh, the camera and light, and each parent node (for composite meshes.) The Node has a link to its defining mesh, a transform, and a link to its SPODMaterial.
- A "SPODMesh" is created for each mesh and contains the interleaved, indexed, vertex data.
- You can draw the children of "composite" nodes ad-hoc if you like, because the GetWorldMatrix() methods automatically multiply by the parent’s transform matrix. However, if you’re doing physics, you’ll want to treat them as a unit (a single btCollisionShape for the composite node.)
- As mentioned above, the Emission component of materials doesn’t come through, so I’m having to handle that manually in my application.
- Again, I haven’t figured out how to get textures directly from Blender to PVR, so I’m handling them separately: create your texture externally, use it in Blender, convert it to PVR for use in your application, and fiddle until things look right. (If you’ve got a better way, please tell us noobs about it!)
- April 24, 2009: Discovered that if you have a material assigned to an object, and you have also assigned materials to faces, but the material assigned to the object is not used on any of the faces, the Collada exporter will crash. Make sure that all materials assigned to an object in Blender are actually used. Delete any that are unused before attempting to export.
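Since the Emission component gets lost on the Collada-to-POD trip, one workaround is to keep the emission values in a little side table in the application and fold them in by hand. Here’s a minimal sketch of that idea; `MaterialExtras` and `AddEmission` are hypothetical names of my own, not anything from the Oolong or PVR headers.

```cpp
#include <algorithm>

// Hypothetical per-material record kept alongside the imported
// materials; the emission RGB is entered by hand because it is
// lost in the Collada -> POD conversion.
struct MaterialExtras {
    float emission[3];
};

// Fold the hand-maintained emission term into an already-lit color,
// clamping each channel to [0, 1] as fixed-function GL would.
void AddEmission(const MaterialExtras &extra, float rgb[3]) {
    for (int i = 0; i < 3; ++i)
        rgb[i] = std::min(1.0f, rgb[i] + extra.emission[i]);
}
```

If you’re on the fixed-function ES 1.x pipeline, you could instead hand the same stored values straight to `glMaterialfv(GL_FRONT_AND_BACK, GL_EMISSION, ...)` before drawing each node.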
Note that there has been some recent discussion of this same topic on the Oolong mailing list, so I really recommend getting hooked up to that. In particular, there is someone there working on a direct Blender reader (into the ModelPOD structures), and mention of a Blender reader in the Bullet physics toolkit. So, there may be more direct methods of using your Blender models in Oolong soon.
I’d really like to thank Wolfgang Engel, the PowerVR folks, the Bullet Physics team, and the rest of the Oolong contributors. My early efforts are showing excellent performance on the device and I’m having a blast with the programming. I don’t know where things are headed next for Oolong, but it can only get better!