1) A page with code that does not validate will always be indexed. True or false?

2) It is advisable to create a sitemap for each site. True or false?

3) A page may not be indexed because of which of the following spider traps:
A - All of the above
B - Directives in the robots.txt file on the web server.
C - Robot metatag restrictions
D - JavaScript-enabled dynamic navigation
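
I think the robots.txt restriction in question 3 (option B) works something like this. This is just a rough sketch with a made-up "/private/" path and example.com URLs, so I'm not sure it's exactly right:

    # A spider that honors robots.txt skips disallowed pages, so they never get indexed.
    from urllib.robotparser import RobotFileParser

    rules = [
        "User-agent: *",
        "Disallow: /private/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # Disallowed path -> the crawler should not fetch (or index) it.
    print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
    # Allowed path -> the crawler may fetch it.
    print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True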

4) You can determine which, if any, and how often spiders are crawling your site by:
A - Doing a navigational search in your favorite browser.
B - Checking your page rank on Google.
C - Checking web server log files routinely.
D - Engaging in IP delivery or cloaking techniques.
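
For question 4, I believe option C means reading the web server's access log. Here is how I'd try to count crawler visits; "access.log", the bot names, and the assumption that the user-agent string appears in each log line (as in the common combined log format) are all guesses on my part:

    # Counts requests from well-known search engine spiders found in an access log.
    from collections import Counter

    KNOWN_BOTS = ("Googlebot", "Bingbot", "Slurp", "DuckDuckBot")

    hits = Counter()
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            for bot in KNOWN_BOTS:
                if bot in line:  # the user-agent string shows up in the log line
                    hits[bot] += 1
                    break

    for bot, count in hits.most_common():
        print(f"{bot}: {count} requests")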

5) An entire site can be banned by a search engine for violating search rules. True or false?

I badly need the answers because I don't know them.
