SEO – Sitemaps and the robots.txt File


This weekend I have been working a lot with sitemaps and the infamous robots.txt file, trying to improve the SEO on my own site as well as my clients’ sites. A great little app that will automatically build your sitemap and upload it via FTP is GSiteCrawler. This free piece of software saves you the time of building the sitemap file by hand.

[Screenshot: the GSiteCrawler application building a sitemap]
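
The file GSiteCrawler produces is a standard XML sitemap. A minimal one looks roughly like the sketch below; the example.com URLs and dates are just placeholders for your own pages:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2010-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://example.com/about/</loc>
  </url>
</urlset>

Only the loc element is required for each URL; lastmod, changefreq, and priority are optional hints for the crawler.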

Also, if you have a website and are trying to improve your search engine ranking, there are two more things you should do. First, create a robots.txt file so that search engines know what they are allowed to crawl. A general robots.txt file that lets a search engine crawl everything looks like this:

User-agent: *
Allow: /
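
You can also point crawlers at your sitemap from the same file by adding a Sitemap line; the URL below is only a placeholder for wherever your sitemap actually gets uploaded:

User-agent: *
Allow: /
Sitemap: http://example.com/sitemap.xml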

A handy tool for all of this is Google’s Webmaster Tools, which lets you submit your sitemap and check your robots.txt.
And of course, you will want to submit your site to the search engines. That, combined with the sitemap and robots.txt, should help increase your rankings and traffic. Two good sites for submissions are http://www.submitexpress.com and http://freewebsubmission.com/.
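
Most of the big engines will also let you notify them of a new or updated sitemap directly. Google, for example, accepts a simple ping request of the form below (substitute your own sitemap URL, ideally URL-encoded):

http://www.google.com/ping?sitemap=http://example.com/sitemap.xml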
