With a modern CMS like WordPress, you may end up with duplicate content: the same content – the same post – is displayed at several URLs (the individual post page, the category page, the tag page, …). Pandia has an article on this by Shari Thurow.
Google doesn’t like this duplicate content. It gets confused, it may think you’re trying to spam, and it puts these duplicate pages into the supplemental index. According to Chris Garrett, you can check which of your pages are frozen in the supplemental index. My pages are here.
If you want to avoid this problem, you should tell Google and the other search engines to spider only part of your site. You can do this with robots.txt and an auto-generated robots meta tag.
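As a sketch, a robots.txt along these lines keeps spiders out of the archive pages while leaving individual posts indexable. The paths shown are common WordPress defaults – adjust them to match your own permalink structure:

```
User-agent: *
# Block the archive views that duplicate post content
Disallow: /category/
Disallow: /tag/
Disallow: /feed/
```

Alternatively, your theme can output a `<meta name="robots" content="noindex,follow">` tag in the head of archive pages only, so search engines still follow the links there but index only the single-post pages.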
In my opinion, search engines should be able to recognize the cases where duplicate content is not spam. They should easily detect that I’m running WordPress and know what my URLs will look like.