• What did we learn today, kids?

    I learned today… that when Blender 2.49b generates a normal map in tangent space, it does so with znormals. I wish I knew better what that meant, but what was important was that Unity expects normal maps based on xnormals. Basically, a depression in an object would look like a bump and vice versa. It took the right Google search to get the answer, but once I had it, a minor invert of the R and G color channels in GIMP got them corrected.
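    For anyone curious what that channel flip actually does to the pixel data, here's a minimal sketch in Python. Plain nested lists stand in for GIMP's pixel buffer, and the function name is mine, not anything from GIMP:

```python
def flip_normal_channels(pixels):
    """Invert the R and G channels of an RGB normal map.

    pixels: list of rows, each a list of (r, g, b) tuples in 0-255.
    Inverting a channel maps v -> 255 - v, which mirrors the normal's
    X/Y components and fixes the bump/depression reversal.
    """
    return [[(255 - r, 255 - g, b) for (r, g, b) in row] for row in pixels]

# A flat "up" normal (128, 128, 255) stays effectively neutral
# (127 vs. 128 is the usual off-by-one of the 0-255 encoding),
# while a tilted normal gets its X and Y directions mirrored.
src = [[(200, 80, 255), (128, 128, 255)]]
print(flip_normal_channels(src))  # [[(55, 175, 255), (127, 127, 255)]]
```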


  • Holy crap… I have a website!

    So I’ve had this site for… a while now. I haven’t posted much of anything, but I may as well start recording my progress again because hey… someone might actually give a crap at some point.

    I have oh… about 4 pending projects going at this point. The biggest news is that I’ve gone Unity! The bottom line… I have no real desire to learn how to deal with 3D physics and collision, and Unity handles that so easily. I’ve already learned how to integrate with native code and tweak the Java side after building an Eclipse project. I’m pretty pleased with the results so far.

    So my projects…

    1. RecallIt – This is my straight Java, Android special ToDo list project. It will be so much more than just a regular ToDo list, but I don’t want to spoil it because I fully intend to get the sucker on the Market.

    2. 3D Die Roller Max – Just made that name up… For all intents and purposes, it’s my Unity learning project. Hoping to actually sell this too…

    3. TTR – Through the Rift is alive and well, but in limbo because I’m not sure if I want to convert to Unity or not. I probably will…

    4. UnNamed Platformer (possibly “Quill”) – First joint project with 1 or 2 partners. Nothing really going yet… just being built in our minds at the moment.


  • OpenGL ES 2.0

    Moving to ES 2.0 because it’s the future and, according to everything I’ve read, it will provide better performance. So… on with it. So far so good; I’ve gotten past most of the hurdles, so all’s more or less well for the time being. Yay.


  • Time for an update…

    It’s been quite some time since I posted anything, so I figured I should whether anyone will actually be reading it or not. Things are going quite well with Through the Rift. I have a lot of engine work done, but of course have tons more to go. I’d gotten an animation for walking working well with matrix palettes (again… still working with OpenGL ES 1.1) but I’m in the midst of needing to implement a more global game solution. I have a few ideas I’m toying with, so I have a pretty good feeling things will be back on track in no time. Here’s to the future of TTR…


  • Mech Prototype

    Behold…

    Mech Prototype


  • Developer Tip 1: Phone GPU Limitations

    If you are like me and developing an OpenGL ES 1.1 game for Android (yes… I know that 2.0 is more modern… I’m just not interested in changing mid-project now that 2.0 is available), then you need to know who your target audience is, i.e. what phones you’re expecting your game to be compatible with. Personally, since I’m quite sure I’ve got at least another year to go, I’m targeting all phones with a Snapdragon or better: the Snapdragon (several phones), the Hummingbird (Galaxy S series), and the TI OMAP 3630 (Droid X and Droid 2(?)). In the next year or so, I have a feeling these kinds of phones are going to be very widespread and that developing for them is really the best target if you’re planning on making an even remotely detailed OpenGL game.

    There’s one catch though. These phones all have 1 GHz CPUs but are, unfortunately, not all created equal. The big disappointment (at least for me…) is the highly touted Snapdragon. It has an AMD GPU that just can’t compare with the PowerVR GPUs paired with the TI and Samsung processors. Sorry, Qualcomm, I think you made a mistake here…

    Head over to http://www.glbenchmark.com/result.jsp and check it out. Grab any Galaxy S phone (notice they’re at the top of the list) and compare it to any Snapdragon device. The differences are pretty dramatic. And here’s the thing that really bugs me: the Extended Matrix Palette extension is supported on all of these devices, and GL_MAX_PALETTE_MATRICES_OES should be 32… but the Snapdragon has a hardware limit of 20 for this parameter. That… stinks. I have at least one model that will need about 24 bones for animation, which means I have to split up the draw calls and my game structure for those elements. It’s not the end of the world, but it feels lame to have to make the change for one set of phones.
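    To illustrate the kind of restructuring that 20-entry limit forces, here's a hypothetical sketch (Python for brevity, names mine, not from any real engine) of greedily grouping triangles into draw calls whose combined bone sets fit within a palette:

```python
def split_by_palette(triangles, palette_limit=20):
    """Greedily group triangles into draw calls whose combined bone
    set fits within palette_limit.

    triangles: list of sets of bone indices used by each triangle.
    Returns a list of [bone_set, triangle_indices] draw calls.
    """
    calls = []  # each entry: [bone_set, [triangle indices]]
    for i, bones in enumerate(triangles):
        placed = False
        for call in calls:
            # Only merge if the union still fits in one palette.
            if len(call[0] | bones) <= palette_limit:
                call[0] |= bones
                call[1].append(i)
                placed = True
                break
        if not placed:
            calls.append([set(bones), [i]])
    return calls

# With a tiny limit of 4, these four triangles end up as 2 draw calls.
tris = [{0, 1, 2}, {1, 2, 3}, {20, 21, 22}, {22, 23}]
print(len(split_by_palette(tris, palette_limit=4)))  # 2
```

    Greedy packing like this isn't optimal, but it keeps the mesh data and draw order simple, which is usually what matters on these phones.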

    Anyway, take what you will from this. My goal is to have my game run pretty darn well on Snapdragon devices, but really shine on the rest. We’ll see how it goes…


  • Captivated

    So I got a Samsung Captivate. Wow… nice damn phone. Between the Super AMOLED screen and the Hummingbird CPU (and its substantially better PowerVR GPU compared to the AMD GPU in the Snapdragon), I’m loving life. I unfortunately realize that I still need to develop with the Snapdragon in mind, but development feels so much easier now. No more slow emulation… and, most importantly, no more limited GL extensions.

    What sucks… is that I spent quite a bit of time working on animations, doing my own modifications to vertices and normals because I couldn’t so much as call glGet() on the current ModelView matrix. Now I’m working on implementing a skeletal animation system with the Matrix Palette extension and having a substantially easier time with it (once I figured out how it worked…). I’m also considering doing lighting fully with textures… but we’ll have to see exactly how much trouble that will be.
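    The per-vertex math the Matrix Palette extension performs is simple enough to sketch. Here's an illustrative Python version (3x3 rotation-only matrices, no translation, to keep it short; this is the blend the hardware does, not my actual GL code):

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def skin_vertex(v, weights, indices, palette):
    """Blend a vertex against a matrix palette:
    v' = sum_i w_i * (M_i * v)
    where indices picks matrices from the palette and weights
    are the per-vertex blend weights (summing to 1)."""
    out = [0.0, 0.0, 0.0]
    for w, i in zip(weights, indices):
        t = mat_vec(palette[i], v)
        for a in range(3):
            out[a] += w * t[a]
    return tuple(out)

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
flip_x   = [[-1, 0, 0], [0, 1, 0], [0, 0, 1]]

# Half-weighted between identity and an X-flip: x cancels to 0.
print(skin_vertex((1.0, 2.0, 0.0), [0.5, 0.5], [0, 1],
                  [identity, flip_x]))  # (0.0, 2.0, 0.0)
```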


  • Have a joint! Not that kind, hippie…

    So believe it or not… this video shows something that was none too easy. The way OpenGL ES works, as long as I’m not completely mistaken, you cannot do transformations midway through drawing faces. Because of this, I was a bit torn on how to do a smooth-looking joint animation, like an elbow for instance. Then I had an idea…

    Basically what I’ve done is take two OBJ output files from Blender and compare them to see which vertices are shared, then output the binary files I’ve already been generating as well as an additional file containing the vertex “pairs.” In my native code, I’m rotating the vertices of the second object manually, that is, running through them all and multiplying them by my own rotation matrix. Then I’m going through the paired-up vertices and using them to set the matching vertices in the first object equal to those in the rotated second object, thus giving the effect seen here.
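    Here's a rough Python sketch of that idea. The real version lives in native code and uses precomputed pair files; these helper names and the brute-force vertex matching are made up for illustration:

```python
import math

def rotate_y(v, degrees):
    """Rotate a vertex around the Y axis."""
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    x, y, z = v
    return (c * x + s * z, y, -s * x + c * z)

def find_pairs(verts_a, verts_b, eps=1e-6):
    """Match vertices in object A to coincident vertices in object B
    (the shared seam between the two Blender exports)."""
    pairs = []
    for i, va in enumerate(verts_a):
        for j, vb in enumerate(verts_b):
            if all(abs(p - q) < eps for p, q in zip(va, vb)):
                pairs.append((i, j))
    return pairs

def bend_joint(verts_a, verts_b, degrees):
    """Rotate object B, then snap A's shared vertices onto B's
    rotated positions so the seam stays welded."""
    pairs = find_pairs(verts_a, verts_b)
    rotated_b = [rotate_y(v, degrees) for v in verts_b]
    out_a = list(verts_a)
    for i, j in pairs:
        out_a[i] = rotated_b[j]
    return out_a, rotated_b

a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(find_pairs(a, b))  # [(1, 0)] — A's second vertex is B's first
```

    Doing the pair-finding offline (as the real pipeline does) is the sensible design choice; the brute-force comparison here would be far too slow at load time for real meshes.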

    Joint Demo from Jeremiah Sellars on Vimeo.


  • Let there be light! (and rotation…)

    Here’s a video of a quick demo of a cube with a texture loaded on it. Pressing in different screen areas affects the diffuse and ambient light. Those values are displayed as text in the upper right corner. Buttons to rotate in the Y and X directions are provided by DayDream.
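    The diffuse/ambient mixing being tweaked in the demo boils down to the standard Lambert term. A tiny illustrative sketch (my own function name and example values, not the demo's code):

```python
def shade(normal, light_dir, ambient, diffuse):
    """Per-vertex Lambert shading:
    intensity = ambient + diffuse * max(0, N . L),
    clamped to 1.0. normal and light_dir are unit 3-vectors."""
    ndotl = sum(n * l for n, l in zip(normal, light_dir))
    return min(1.0, ambient + diffuse * max(0.0, ndotl))

# Face pointing straight at the light: full diffuse contribution.
print(shade((0, 0, 1), (0, 0, 1), 0.25, 0.5))  # 0.75
# Face at 90 degrees to the light: ambient only.
print(shade((1, 0, 0), (0, 0, 1), 0.25, 0.5))  # 0.25
```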

    Simple 3D Demo from Jeremiah Sellars on Vimeo.


  • Deep in the heart of the native jungle…

    So moving my OpenGL code from Java to the native environment is officially going well. I’ve just completed the last step of the initial test phase, which was getting all of the data for vertices, normals, texture coordinates, indices, and finally the texture image data loaded off the SD card and used for rendering. I’m quite pleased to say (after finding out that I have to use RGB565 color and that red is in the higher bits) that it’s all working smoothly. It may be time to really head back into the design phase…
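    That RGB565 detail (red in the high bits) packs like this. A quick Python sketch of the bit layout, not the actual loader code:

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit RGB into a 16-bit RGB565 value, red in the
    high bits: RRRRRGGG GGGBBBBB. The low bits of each 8-bit
    channel are simply dropped (5 bits red, 6 green, 5 blue)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_rgb565(255, 0, 0)))    # 0xf800  (red in the top 5 bits)
print(hex(pack_rgb565(0, 255, 0)))    # 0x7e0
print(hex(pack_rgb565(0, 0, 255)))    # 0x1f
print(hex(pack_rgb565(255, 255, 255)))  # 0xffff
```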