The Hess Report
The silo is a simple tube. I decided to actually model the bands running around it, as I wanted them to stick out slightly from the sides of the tower, and a texture alone could not do that.
Tori are hard to model, so I used the NURBS Donut surface primitive, eventually converting it to a mesh and duplicating it along the height of the silo. The roof started as a single section, which was then arrayed with the SpinDup tool. The chute is a flattened tube; the bands on it are just a texture.
The roof in edit mode. The SpinDup'ed section is selected.
Wires and Cables
The electrical wiring on the face of the barn and the cable leading offstage were Bezier curves with Bezier circles set as the BevOb in the edit buttons. Don't be afraid of these tools; they are easy to use and give great results. Once I had them the way I wanted, I converted them to meshes. Afterward, I was able to fine-tune their thickness by entering edit mode and using Alt-S to scale along the vertex normals.
Two different techniques were used to generate the grass, but first I'd like to talk a little bit about how I decided to do what I did. In 3D, you need enough complexity and randomness that the image does not look bland. On the other hand, complete randomness also looks computer generated, and too much complexity will both overwhelm your image and shoot your rendering times through the roof. On the complexity issue, I was able to get away with much less grass than there is in nature: the grass ends just past the edge of the image, and seen from above it is unnaturally sparse. The low viewing angle of the camera made the grass appear thicker than it actually was. That took care of the complexity issue. As for randomness, that came into play in the modeling process, so let's talk about it now.
I'm a big fan of RipSting's Fiber generator, and I use version 2.03 at the moment. I did not, however, use it for the main grass on this piece. As good as it is, Fiber cannot put branches or leaves on its grass, and I needed more detail than it could provide. I felt that the bulged head at the top of seed grass was an important visual for my image: it's what the eye is used to seeing in such a situation. Here was my process:
The grass clump on the left, the seed grass on the right.
I modeled a single stalk of seed grass, keeping it as low poly as I could. Working on a copy of the ground object (a fractal subdivided plane pulled around a bit with PET), I selected some sections of vertices and faces, then subdivided them. There was not a lot of rhyme or reason to it, but I didn't want to be completely random, either. The areas of denser vertices would receive more (and therefore denser) grass. I also used the Select Random verts tool a couple of times for some true computer generated noise.
Click the image for an animation of the ground subdivision process.
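The idea behind that uneven subdivision can be sketched outside Blender. In this hypothetical plain-Python sketch (not the Blender API; all names are made up for illustration), a strip of ground cells starts with one vertex each, subdividing chosen cells multiplies their vertex count, and a random sample stands in for the Select Random tool:

```python
import random

def subdivide(cells, chosen, factor=4):
    """Multiply the vertex count of the chosen cells; subdividing a
    quad roughly quadruples its vertices."""
    return [n * factor if i in chosen else n for i, n in enumerate(cells)]

def select_random(cells, fraction, seed=0):
    """Mimic Blender's Select Random: pick a random fraction of cell indices."""
    rng = random.Random(seed)
    count = max(1, int(len(cells) * fraction))
    return set(rng.sample(range(len(cells)), count))

# A 1D strip of ground cells, one vertex each to start.
ground = [1] * 10
# Deliberately subdivide a block of cells (not completely random)...
ground = subdivide(ground, chosen={3, 4, 5})
# ...then add some true computer-generated noise on top.
ground = subdivide(ground, chosen=select_random(ground, 0.2, seed=42))
print(ground)  # denser cells will later receive denser grass
```

The point of the sketch is only that deliberate choices and random noise stack: the hand-picked block stays denser than its neighbors no matter what the random pass hits.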
Then, I made the ground the parent of my grass stalk, turned on dupliverts, and made the ground mesh a static particle system. In order to get things oriented properly, I had to fool with the settings on both the particle system for the ground object, and tracking buttons on the grass. Once I was happy with the distribution of the grass, I made the dupliverted objects real.
Sample settings for the grass particles. Dupliverts is turned ON.
Note the TrackTo and Up settings for the grass object.
Grass as static dupliverted particles. Note the higher density where there are more vertices.
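Conceptually, dupliverts just stamps a copy of the child object at every vertex of the parent mesh, which is why vertex density translates directly into grass density. A minimal plain-Python sketch of that idea (hypothetical names, not the Blender API):

```python
def dupliverts(parent_vertices, child):
    """Return one copy of the child placed at each parent vertex --
    all dupliverts does, so vertex density becomes object density."""
    return [{"object": child, "location": v} for v in parent_vertices]

# A ground strip whose right half was subdivided more heavily.
ground_verts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0),
                (2.5, 0.0), (3.0, 0.0), (3.5, 0.0)]
stalks = dupliverts(ground_verts, "seed_grass_stalk")
print(len(stalks))  # one stalk per vertex: 6
```

Making the duplis real then turns each of those placements into an independent object that can be rotated and scaled on its own.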
After that, I had a bunch of seed grass, all the same height and all rotated the same way. It looked like crap. So, I wrote a short Python script that applied a random rotation and scaling, within specifiable limits, to all selected objects. I ran it, and a few seconds later I had some great-looking grass. Of course, the first time I did this the grass was way too thick (one of the problems with my last image), so I went through this whole procedure several times until I achieved grass of the proper visual scale.
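The script isn't reproduced here, but its job is easy to reconstruct: for each selected object, pick a random Z rotation and a random scale factor inside user-specified limits. A hedged plain-Python sketch of that logic (the dictionaries stand in for Blender objects; names are illustrative):

```python
import random

def randomize(objects, scale_min=0.7, scale_max=1.3,
              max_rot_deg=360.0, seed=None):
    """Apply a random but specifiable scale and Z rotation to each
    object, breaking up the cloned-army look of dupliverted grass."""
    rng = random.Random(seed)
    for ob in objects:
        ob["scale"] *= rng.uniform(scale_min, scale_max)
        ob["rot_z"] = (ob["rot_z"] + rng.uniform(0.0, max_rot_deg)) % 360.0
    return objects

stalks = [{"scale": 1.0, "rot_z": 0.0} for _ in range(100)]
randomize(stalks, seed=1)
```

The later pass that shortens grass along the path to the barn doors would be the same call with something like scale_min=0.4 and scale_max=0.8, run on just the selected stalks.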
My complexity and randomness rules mentioned above indicated to me that this was not good enough. Even randomized and distributed, the single model of a grass stalk was too uniform. So, I did another pass of this whole procedure, this time using a model of a cluster of simple-bladed, V-shaped grass stalks. Of course, I used a different copy of the ground object, with a different semi-random subdivision pattern to make the static particles, so that the different kinds of grass had varying densities in relation to one another. After applying the randomizing script to this new grass object, things were looking very good.
Finally, I went to RipSting's Fiber script for the green ground cover. It makes great flat grass, and that was all that I needed as an underlayer to complete the modeling.
Well, not quite. I still wasn't entirely happy with the randomness. It was too uniformly random. So, I selected different sections of the finished grass and shrank them, using the randomizer script to scale them to between 0.4 and 0.8 of their original height. This is why in the finished image you see the grass get shorter in a path leading up to the front doors of the barn, and why there are areas in the extreme right foreground where it is significantly shorter. Finished product: random, but showing signs of intelligent and patterned interaction; complex enough to look right, but not so much that it was impossible to work with.