Thanks for that, engtech. Insofar as I could follow what they were saying, it does appear that I at least fall into that group. At least this bit is heartening:
The sandbox has also been observed to typically release sites into "normal" rankings en masse, which is to say that there have been virtually no examples of a single site "escaping" by itself. It appears that certain updates in Google's search engine release many sites all at once. Speculation about this centers around Google wishing to avoid the appearance of manually reviewing sites one by one, although other reasons have been proposed as well.
Although this advice is scary:
One of the most common elements suspected of sandboxing completely "natural" sites is their addition to blogrolls. These links are sitewide, on URLs that frequently have many thousands of pages in Google's index, and on the surface it appears they can cause the link problems that lead to sandboxing. The best way to avoid this is to watch your logs for referring URLs and ask to be removed from any blogrolls that show up. With some luck, the sympathetic blogger will understand and remove you. It seems ridiculous to have to go to these lengths to avoid sandboxing, but in the commercial reality of the web, it may, in fact, help you in both the short and long run. Naturally, if you aren't running a blog on your site, it's much easier to "stay off the rolls", but you also miss inclusion in great blog directories and traffic sources that can earn you high-quality links (e.g. Technorati, Blogwise, etc.).
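For anyone who wants to take the log-watching advice literally, here is a minimal sketch of the idea: pull the referrer field out of a standard Apache/Nginx "combined" log and tally the referring hostnames, so sitewide blogroll links stand out by volume. The log lines and hostnames below are made-up samples, not real sites.

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Matches the request, status, size, and referrer fields of a
# "combined" format log line:
#   ... "GET / HTTP/1.1" 200 2326 "http://referrer.example/" "UA"
REFERRER_RE = re.compile(r'"[^"]*" \d{3} \S+ "([^"]*)"')

def referring_hosts(lines):
    """Count referring hostnames, skipping empty ("-") referrers."""
    hosts = Counter()
    for line in lines:
        m = REFERRER_RE.search(line)
        if m and m.group(1) not in ("", "-"):
            host = urlparse(m.group(1)).netloc
            if host:
                hosts[host] += 1
    return hosts

if __name__ == "__main__":
    # Hypothetical sample log lines for illustration.
    sample = [
        '1.2.3.4 - - [10/Oct/2005:13:55:36 -0700] "GET / HTTP/1.1" 200 2326 '
        '"http://someblog.example/blogroll" "Mozilla/5.0"',
        '5.6.7.8 - - [10/Oct/2005:13:55:37 -0700] "GET / HTTP/1.1" 200 2326 '
        '"-" "Mozilla/5.0"',
    ]
    for host, n in referring_hosts(sample).most_common():
        print(host, n)
```

A host that appears on nearly every visit is a good candidate for a sitewide blogroll link, and that's the blogger you'd politely email.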