e2initialmodel.py
This program takes a set of class-averages or projections and builds a set of 3D models suitable for use as initial models in single particle reconstruction. The set is theoretically sorted in order of quality (the best one is numbered 1), though it is best to look at the other answers as well.
Options:

input (string)
    The name of the image containing the particle data

iter (int)
    The total number of refinement iterations to perform

tries (int)
    The number of different initial models to generate in search of a good one

sym (string)
    Specify symmetry. Choices are: c<n>, d<n>, h<n>, tet, oct, icos

savemore (bool)
    Will cause intermediate results to be written to flat files

orientgen (string)
    The type of orientation generator. Default is safe. See e2help.py orientgens

dbls (string)
    Database list storage, used by the workflow. You can ignore this argument.

v, verbose (int)
    Verbose level [0-9]; a higher number means more verbose output
Results
When the program is complete, you will find several BDBs in the initial_models directory. There are 4 types of files:
model_xx_yy 
This is the refined model you would use as a starting model 
model_xx_yy_aptcl 
Contains the class-averages alternating with the corresponding projections of the model after the final iteration. Poor agreement between pairs is an indication of a bad initial model. 
model_xx_yy_proj 
Projections from the final round of refinement covering the asymmetric triangle. 
model_xx_yy_init 
The random, blobby initial model for this refinement. Just in case you want to see it. 
xx is the run number. If you run the initial model generator a second time, this number will be incremented.
yy is the model number. In theory these are sorted in order of quality at the end of the run, so 01 will be the best. However, this isn't very reliable. You should check all of the models, and all of the _aptcl files, to find the best one.
How it Works
The program begins by making a pure noise image. This image is then low-pass filtered to produce a fairly random pattern of blobs, which is then masked. The size of the mask is a fixed proportion of the total box size, basically assuming that you followed the advice in the tutorial and used a box 1.5 - 2.0x larger than the longest dimension of your particle.
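The noise-filter-mask sequence above can be sketched in NumPy. This is only an illustration of the idea, not EMAN2's actual implementation; the box size, filter radius, and mask fraction below are made-up values.

```python
import numpy as np

def random_blob_model(box=64, lowpass_radius=0.08, mask_frac=0.35, seed=0):
    """Illustrative random 'blobby' initial model: noise -> low-pass -> mask."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((box, box, box))

    # Low-pass filter: zero out all but the low spatial frequencies.
    f = np.fft.fftn(noise)
    freq = np.fft.fftfreq(box)
    fx, fy, fz = np.meshgrid(freq, freq, freq, indexing="ij")
    f[np.sqrt(fx**2 + fy**2 + fz**2) > lowpass_radius] = 0
    blobs = np.fft.ifftn(f).real

    # Spherical mask covering a fixed fraction of the box, mirroring the
    # advice to box particles 1.5 - 2.0x larger than their longest dimension.
    c = (box - 1) / 2.0
    z, y, x = np.ogrid[:box, :box, :box]
    r = np.sqrt((x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2)
    blobs[r > mask_frac * box] = 0
    return blobs

model = random_blob_model()
```

Each call with a different seed yields a different random blob pattern, which is what makes repeated tries explore different starting points.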
This random blob model is then used to seed a single particle refinement very similar to the one used in e2refine.py. However, in this case the input images are just a few class-averages, not a massive stack of particle images, and usually these class-averages have been scaled down from the full-size images. In addition, a number of steps are taken to make the refinement run very quickly (and completely in RAM; intermediate results are not stored to disk). For example, only the fast, correlation-based comparator is used, a large angular step is used, etc. This refinement is allowed to run for a relatively large number of iterations, and finally the final model is written to disk as model_xx_yy. Additional diagnostic files are also written to disk.
This whole process is repeated 'tries' times, producing a set of output files sorted in order of the presumed quality of each result.
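The outer loop amounts to running independent refinements and sorting the results by score. The sketch below is a toy stand-in: the placeholder model and random quality score merely mark where the real blob generation and refinement metric would go.

```python
import random

def make_initial_models(tries=10, seed=0):
    """Toy outer loop: generate, 'refine', and rank several candidates."""
    rng = random.Random(seed)
    results = []
    for i in range(tries):
        model = f"blob_{i}"     # placeholder for a random blob model
        quality = rng.random()  # placeholder for the refinement's quality score
        results.append((quality, model))
    # Best result first, so it would be written as model number 01.
    results.sort(key=lambda t: t[0], reverse=True)
    return results
```

The sorting step is why model 01 is nominally the best candidate, subject to the reliability caveats above.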
Evaluating Results
This process is not guaranteed to give a good model. For most structures there are only 2 or 3 different 'local minima' that the refinement process could reasonably converge to, but this number increases as you shrink the input particles more and more. In some cases there may be as many as 10-15 different possible pseudo-stable answers. These 'local minima' are almost always obviously wrong, but you must take time to look through the results to make sure you identify them, and if necessary, run the initial model generator again to look for more possible solutions. In most cases, generating 5-10 solutions will get you at least one structure good enough to seed a high resolution refinement.
To evaluate the results, first look at the structure itself, and make sure it looks reasonable given what you've seen of the data and class-averages. Next, look at the _aptcl file, which contains a projection of the final model corresponding to each class-average, in pairs. Look through the file; you should see that all of the class-averages have reasonably good agreement with their projections. If you find a class-average which is not well-matched to any projection, then either you have a bad structure or a bad class-average. Reasonable agreement between these pairs is a necessary but not a sufficient condition for a correct reconstruction.
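If you want a numeric handle on the visual comparison, pairwise agreement can be scored with a normalized cross-correlation. This is a hedged sketch assuming a stack laid out as described above (averages at even indices, matching projections at odd indices); EMAN2's own comparators differ.

```python
import numpy as np

def pair_agreement(stack):
    """stack: array of shape (2*n, ny, nx), class-averages at even indices,
    matching projections at odd indices. Returns one score per pair,
    where 1.0 means identical up to linear scaling."""
    scores = []
    for avg, proj in zip(stack[0::2], stack[1::2]):
        a = (avg - avg.mean()) / (avg.std() + 1e-12)
        p = (proj - proj.mean()) / (proj.std() + 1e-12)
        scores.append(float((a * p).mean()))
    return scores
```

A pair with a score far below the others is exactly the kind of poorly-matched class-average the paragraph above tells you to hunt for.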
Finally, take a look at the set of projections of the 3D model and make sure that also seems sensible. If everything seems to fit together well, then you should be set, and can use the model to seed your higher resolution reconstructions. For real proof that your final structure is accurate, please see the single particle reconstruction tutorial, which discusses this point.