Turtle Tennis

A dose of programming, sarcasm and pedantry for your viewing.


Welcome to Turtle Tennis. If you came here actually looking for turtles or tennis, then I'm afraid you've come to the wrong place. In which case, please see turtles or tennis.

Google product search feeds

For a while now, I've been working on generating product feeds for Google, so that they can list our products when people search the shopping area of the Google website. Google has some guidelines for the format and content of these feeds, and they are particularly strict about them. Some of the problems that I have encountered are:

Hopefully the above will provide you with some help if you're trying to get your product feed accepted or at least show you how strict the rules that Google apply are.


To help search engine bots discover and crawl the pages on your website, you can use XML sitemap files. In the robots.txt file in the root folder of your website (you can add one if you don't already have one), add the entry: "Sitemap: http://www.example.com/sitemapfilename.xml". If you have a very large sitemap (each sitemap file is limited to 50,000 URLs and 10MB uncompressed), you can split it up into multiple files and use a sitemap index file to link to each of the individual sitemap files, then link to the index from robots.txt using the same "Sitemap:" directive (there is no separate directive for index files). Splitting can also be used to group your URLs in separate files: for example, your website has three distinct sections and there are too many URLs to fit in one sitemap file, so you split them into three separate sitemap files, one for each section, and link to them from a sitemap index file.
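As a sketch of the setup described above (the hostnames and filenames are illustrative, not from a real site), the robots.txt entry and a matching sitemap index file might look like this:

```
# robots.txt at the root of the site
Sitemap: http://www.example.com/sitemap-index.xml
```

```
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-section1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-section2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-section3.xml</loc>
  </sitemap>
</sitemapindex>
```

Each of the three section files is then an ordinary sitemap file containing its own `<urlset>` of page URLs.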

Your sitemap does not need to be kept on the same webserver: because it is linked to from robots.txt on your site, this is considered safe. In practice you probably want control over the content of your sitemaps and the location in which they are stored, but you could potentially outsource the generation and management of your sitemaps to another company or department.
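For example, a robots.txt entry pointing at a sitemap hosted on a different server might look like this (both hostnames are illustrative):

```
# robots.txt on www.example.com, referencing a sitemap stored elsewhere
Sitemap: http://sitemaps.example-hosting.com/example-com-sitemap.xml
```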

Your sitemap should not contain any links to files which are disallowed in robots.txt. This will cause errors, or at least mean that those pages aren't crawled. Either remove the page(s) from your sitemap, or remove or change the rule(s) which block search bots from accessing those files.
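To sketch the conflict described above (the paths are illustrative): if your robots.txt contains

```
User-agent: *
Disallow: /private/
```

then a sitemap entry such as `<loc>http://www.example.com/private/page.html</loc>` points at a URL that bots have been told not to fetch, and that page will never be crawled from the sitemap.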

Some search engine providers will let you set up an account for your website, to let them know that your site exists and where to find it, and to provide more detailed information about your site than the search engine may be able to retrieve by crawling it. There may also be an option to enter the URL of your sitemap or sitemap index file.


Having spent over 4 years living in Southampton, I felt it was time I shared some of my knowledge, so I've written some reviews of pubs in Southampton. If you're in Southampton and don't know which pub to go to this evening, you might want to take a look to see which ones sound most suited to you. Alternatively, you could read it to see which ones I recommend and avoid them to avoid me; it's up to you.

Now that you've decided to go to the pub (good choice by the way) it's time to choose which area in Southampton you want to go out in:
the Highfield area, the Portswood area or the Bevois Valley area.

Encoding &s!

Yesterday I was fixing problems with XML compliance, mainly the encoding of & to &amp; in HTML which is rendered on the page. I learnt some things about what should and shouldn't be encoded, and the problems that may arise from doing so. The idea was to replace & with &amp; in all the URLs in the HTML on every page of the website. These are used in query string parameters, so by the time they reach the address bar they need to be &, but in the raw HTML they need to be &amp;. The obvious exclusion to this is in JavaScript: when writing if(a=='a' && b=='b') the &s shouldn't be encoded, and if a request is made in JavaScript, the query string parameters shouldn't use &amp;, otherwise you'll get amp; prefixed to the names of your query string variables, which is where my main problem occurred.
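A minimal sketch of the rule above, showing the same query string in its raw-HTML form versus the form actually requested (the page and parameter names are illustrative, not from the original site):

```javascript
// In raw HTML, the query string separator must be entity-encoded:
var htmlHref = 'products.php?page=2&amp;sort=price';

// By the time the browser requests the link, the entity has been
// decoded back to a plain ampersand:
var requestedUrl = htmlHref.replace(/&amp;/g, '&');
console.log(requestedUrl); // products.php?page=2&sort=price

// In JavaScript source, && is the logical AND operator and must
// NOT be encoded:
var a = 'a', b = 'b';
if (a === 'a' && b === 'b') {
  console.log('both matched');
}
```

The same URL therefore exists in two forms: entity-encoded while it sits in the HTML, and plain once it is used to make a request.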

I'm dealing with a page which lists products. There's a drop-down menu with options for the sort order of the products, and an ajax request is made when one of these options is chosen, to update the products with the new sort order. We're using the value of each option to store the ajax URL, and one of the parameters is o=n, where n is an integer between 1 and 7 which defines the order that the products should be listed in. The values are in the raw HTML, so the query string parameter has to have its &s encoded as &amp;, but when we call the function which updates the page, we use jQuery to get the value of the selected option and make a request with that value. The JavaScript keeps the amp; in the string it extracts, causing the requested URL to contain &amp;o=n, so that when you parse the query string looking for o=n, you don't find it!