For all the great things that ASP.NET 2.0 introduced, Master Pages are somewhat of a double-edged sword. They are great for streamlining a design but, in your specific case, they are a bane.
Here is the long and the short of it: first, the only way you are going to exclude a spider from hitting X page (without using meta tags) is to use a robots.txt file and prevent it from crawling those pages that way. Secondly, and more importantly, there are your meta tags. To paraphrase Aaron Wall from www.seobook.com:
since you have the exact same meta tags across multiple pages, if not your entire site, it is at best a wasted opportunity. This has to do with duplicate content, and it is one thing that is covered at great length in almost every SEO
book. Google particularly frowns upon this and, let's be serious, if you're not on Google you may as well not exist. On the flip side of the coin, removing your meta tags completely may cause your site to NOT get indexed at all by other search engines.
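To keep your meta tags unique per page even with a Master Page, you can set them from each content page's code-behind. Here is a minimal sketch; the page name, title, and description strings are made-up examples, and it assumes your Master Page declares its head element with runat="server":

```
// Code-behind for one content page (e.g. Products.aspx.cs).
// Assumes the Master Page has <head runat="server">.
using System;
using System.Web.UI.HtmlControls;

public partial class Products : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // A unique title per page instead of one shared Master Page default.
        Page.Title = "Acme Widgets - Product Catalog";

        // A unique description meta tag per page.
        HtmlMeta description = new HtmlMeta();
        description.Name = "description";
        description.Content = "Browse Acme's full catalog of widgets, updated weekly.";
        Page.Header.Controls.Add(description);
    }
}
```

Page.Title, Page.Header, and HtmlMeta are all available as of .NET 2.0, so this works without touching the Master Page itself.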
Thirdly, think of a spider as a regular user. If there are pages of your site that a real person cannot get to without a username and password, the same will hold true for the spider.
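On the robots.txt point from earlier: a file like this at the root of your site tells well-behaved spiders to stay out of the pages you do not want crawled (the paths below are made-up examples; swap in your own):

```
User-agent: *
Disallow: /members/
Disallow: /admin/
```

Keep in mind robots.txt is only a request, not access control, so anything truly private should still sit behind that username and password.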
Lastly, if you want more information on SEO
and ASP.NET 2.0, I suggest picking up Jamie and Cristian's book, Professional Search Engine Optimization with ASP.NET: A Developer's Guide to SEO
(you can follow the link in my sig), and you can read the excerpt chapter here (it has to do with URL rewriting): http://www.wrox.com/WileyCDA/Section/id-305997.html
Technical Editor for:
Professional Search Engine Optimization with ASP.NET
Professional IIS 7 and ASP.NET Integrated Programming