Every Search Engine Robot Needs Validation
Your website is ready. Your content is in place, and you have optimized your pages. What is the last thing you should do before uploading your hard work? Validate. It is surprising how many people never validate the source code of their web pages before putting them online.
Looking at what your visitors and the robots need, you can easily see how making your website "search engine friendly" also makes it visitor friendly.
For example, one project I worked on had many validation problems. The sheer number of errors in the source code prevented the search engine robots from indexing the page, and in particular a section of text containing the keyword phrases chosen specifically for that page. Ironically, human visitors had trouble with the page as well. Being smarter than robots, they could work around the problem, but the robots could not. Fixing the source code corrected the situation for both human and automated visitors.
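A quick pre-upload check can catch exactly this kind of structural error before a robot ever sees it. The Python sketch below is illustrative only: the file name index.html is a placeholder, the TagBalanceChecker class and VOID_TAGS list are my own, and it catches far less than a full validator such as the W3C Markup Validation Service at validator.w3.org. It simply reports tags that are opened but never closed, or closed out of order, which is the sort of error that can swallow a whole block of keyword text.

from html.parser import HTMLParser

# Tags that are legally "void" in HTML and never take a closing tag.
VOID_TAGS = {"area", "base", "br", "col", "embed", "hr", "img", "input",
             "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Reports tags that are opened but never closed, or closed out of order."""

    def __init__(self):
        super().__init__()
        self.open_tags = []   # stack of (tag, (line, column)) still waiting to be closed
        self.problems = []    # human-readable error messages

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.open_tags.append((tag, self.getpos()))

    def handle_startendtag(self, tag, attrs):
        # Self-closing syntax such as <br/> is already balanced; nothing to track.
        pass

    def handle_endtag(self, tag):
        if self.open_tags and self.open_tags[-1][0] == tag:
            self.open_tags.pop()
        elif tag not in VOID_TAGS:
            line, _column = self.getpos()
            self.problems.append(f"unexpected </{tag}> at line {line}")

    def report(self):
        # Anything still open at the end of the document was never closed.
        unclosed = [f"<{tag}> opened at line {line} is never closed"
                    for tag, (line, _column) in self.open_tags]
        return self.problems + unclosed

if __name__ == "__main__":
    # "index.html" is a placeholder; point this at the page you are about to upload.
    with open("index.html", encoding="utf-8") as page:
        checker = TagBalanceChecker()
        checker.feed(page.read())
    problems = checker.report()
    for problem in problems:
        print(problem)
    if not problems:
        print("No obvious tag-balance problems found.")

Note that HTML lets some tags, such as <p> and <li>, close implicitly, so this rough check can raise false alarms. Treat it as a smoke test, not a replacement for running the page through a real validator.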
As I said before, what works for your website visitors works for the search engine robots. Usability is the key for both. Why not give your human and automated visitors alike the best possible chance to view your pages?