1) A page whose code does not validate will always be indexed. True or false?
2) It is advisable to create a sitemap for each site. True or false?
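For context on what question 2 is asking about, here is a minimal sketch of generating a sitemap in the sitemaps.org XML format. The page URLs and the output filename are hypothetical placeholders, not anything from the quiz itself:

```python
import xml.etree.ElementTree as ET

# Hypothetical page URLs; a real sitemap would list your own site's pages.
PAGES = ["https://example.com/", "https://example.com/about.html"]

# Build a minimal sitemap per the sitemaps.org 0.9 protocol.
urlset = ET.Element(
    "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
)
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write sitemap.xml with an XML declaration.
ET.ElementTree(urlset).write(
    "sitemap.xml", encoding="utf-8", xml_declaration=True
)
```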
3) A page may fail to be indexed because of which of the following spider traps?
A - All of the following
B - Directives in the robots.txt file on the web server
C - Robots meta tag restrictions
D - JavaScript-enabled dynamic navigation
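To make options B and C concrete: a crawler that honors the robots exclusion protocol will skip pages it is told not to fetch, so those pages never reach the index. Here is a minimal sketch using Python's standard urllib.robotparser; the domain and page path are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site; substitute the domain you actually care about.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt file

# Ask whether Googlebot is allowed to fetch a particular page.
page = "https://example.com/private/page.html"
if rp.can_fetch("Googlebot", page):
    print("robots.txt allows Googlebot to crawl this page.")
else:
    print("robots.txt blocks Googlebot; the page may never be indexed.")
```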
4) You can determine which spiders, if any, are crawling your site, and how often, by:
A - Doing a navigational search in your favorite browser.
B - Checking your page rank on Google.
C - Checking web server log files routinely.
D - Engaging in IP delivery or cloaking techniques.
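Since option C describes a concrete how-to, here is a minimal sketch of scanning a web server log for crawler visits. It assumes an Apache/Nginx combined-format log file named access.log and uses a short, non-exhaustive list of crawler user-agent substrings:

```python
import re
from collections import Counter

# Short, non-exhaustive list of crawler user-agent substrings.
SPIDERS = ("Googlebot", "bingbot", "Slurp", "DuckDuckBot")

hits = Counter()
# Assumes an Apache/Nginx combined-format log named access.log.
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        for bot in SPIDERS:
            if bot in line:
                # Grab the date part of the [10/Oct/2023:13:55:36 ...] stamp.
                match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
                date = match.group(1) if match else "unknown"
                hits[(bot, date)] += 1
                break  # count each request once even if several names match

# Report which spiders visited and how often, day by day.
for (bot, date), count in sorted(hits.items()):
    print(f"{date}  {bot}: {count} requests")
```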
5) An entire site can be banned by a search engine for violating search rules. True or false?
I badly need the answers to these questions because I don't know them myself.