We have been using sitemap scripts for the past few years to generate sitemaps easily for our websites and blogs. Once generated, these are submitted to Google, Yahoo, Bing, Baidu and other search engines, all of which then find it easier to crawl our content and index it quickly. When a website has thousands of pages of content, it can be hard for search engine robots to find all of them, so generating a sitemap for your website is always recommended.
In this article we are going to recommend a paid sitemap script that also offers free sitemap generation for websites of up to 500 pages. If you have more pages to be crawled and updated, you pay a one-time fee of just $19.99 for a standalone version, which you upload to your hosting account and run from there. The paid version lets you crawl an unlimited number of pages hosted on your server, with the flexibility of blocking certain folders and customizing the sitemap pages, which can be exported as .xml, .gz and other formats. The final sitemap is generated in the following formats: HTML sitemap, text sitemap, ROR sitemap and XML sitemap.
Once you purchase a copy of this script, you will be given access to a downloadable file named generator.zip. Download it to your computer and extract the contents of the zipped file using an application such as WinZip or WinRAR. Next, open the FTP software you normally use to access your web hosting account and log in with the credentials provided by your hosting company.
Next, upload the contents of the 'Generator' folder: either place all the files directly in your 'public_html' folder, upload the folder as it is, or create a new folder named 'Sitemaps' and upload all the files into it. In most FTP clients this is done by selecting the extracted files in the left (local) panel and dragging and dropping them into the remote folder.
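If you prefer the command line to a graphical FTP client, the same upload can be sketched with lftp's mirror command. This is only a template with placeholder credentials, assuming lftp is installed on your computer; the remote path depends on which of the layouts above you chose:

```shell
# Template only: replace the <placeholders> with your own FTP details.
lftp -u <ftp-user>,<ftp-password> ftp.<yourdomain>.com <<'EOF'
# mirror -R copies in reverse: local folder -> remote folder
mirror -R Generator /public_html/Sitemaps
bye
EOF
```

Either way, the end result should be the same set of files sitting under your web root that the FTP drag-and-drop method produces.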
Now you need to create two empty files manually on your computer and upload them to the root folder of your web host. Open Notepad and use the 'Save As' option to create two blank files named 'sitemap.xml' and 'ror.xml'. Once uploaded to your web host, these files will normally have a file permission of 644, which has to be changed to 666 so that the script can write the sitemap information into them.
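If your host gives you SSH access, the same step can be done from the command line instead of Notepad and the FTP client. A minimal sketch, to be run inside your site's document root:

```shell
# Create the two empty placeholder files the generator will write into.
touch sitemap.xml ror.xml

# Loosen the default 644 permissions to 666 so the script
# (running as the web server user) can write to both files.
chmod 666 sitemap.xml ror.xml

# Verify the result: both files should show -rw-rw-rw-.
ls -l sitemap.xml ror.xml
```

Note that 666 makes the files world-writable, which is exactly what the script needs here, but it is worth setting them back to 644 if you ever stop using the generator.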
Once this is done, open the sitemap section in your browser and click the 'Crawler' option to start crawling your website. Make sure you select the option 'Do not interrupt the script even after closing the browser window until the crawling is complete', so that the script keeps running even if you close your browser. On my blog the script took around two hours to complete, and for bigger websites it can take many hours, so running it in the background is recommended.
Once the crawling is completed, a report is generated and sent to the email address you set in the configuration, covering the sitemap generation request date and time, total pages indexed, sitemap files and total page size in MB, along with links to the different formats of your sitemap and a list of all your broken links. The broken-link list is especially useful: based on it you can remove the broken links from your website and make sure that neither bots nor visitors are sent to error pages.
Finally, if you like this script, go get a copy of this Sitemaps Script.