that I may have been much too harsh on Google for the subpar revenue that many BB and blog publishers believe AdSense should generate for them. This was just dropped off by storks at the cabbage patch, or fresh-skimmed from what bubbled up in the fermentation tanks... it's certainly fresh-baked, and possibly half-baked.
These sites have one thing in common: they are not based on static content. Most of the software used to generate their content is of the PHP/MySQL variety; K5 is Perl, and although I haven't checked into it, I assume a heavy user of CGI calls. In my experience, the most commonly used CMSes, including blog and BB packages, suck at producing pages containing usable meta information, especially Dublin Core.
CMS codebases are often anti-semantic, and if you want properly targeted content served from ad servers on your site, you should make sure the generated page headers are loaded up with as many meta tags providing simple and accurate Semantic Web data as can be implemented. Yes, for this purpose, I'd recommend getting back into that reprehensible, time-sucking ghoul of a habit: keyword meta tag optimisation.
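To make that concrete, a head element primed for both old-school keyword targeting and Dublin Core might look something like this (the page title and values are invented for illustration; the DC.* element names come from the Dublin Core element set):

```html
<head>
  <title>Widget Reviews: The Frobnicator 3000</title>
  <!-- Old-school keyword/description meta tags -->
  <meta name="keywords" content="widgets, frobnicator, hardware reviews" />
  <meta name="description" content="Hands-on review of the Frobnicator 3000." />
  <!-- Dublin Core metadata -->
  <link rel="schema.DC" href="http://purl.org/dc/elements/1.1/" />
  <meta name="DC.title" content="Widget Reviews: The Frobnicator 3000" />
  <meta name="DC.subject" content="widgets; hardware reviews" />
  <meta name="DC.date" content="2005-03-01" />
  <meta name="DC.language" content="en" />
</head>
```

None of the common CMSes I've seen emit anything close to this out of the box; you get a title and maybe a generator tag, and that's it.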
The expectation that ad-server bots will fine-comb page content like a search-engine spider is unrealistic, and if they did, it would likely cause bottlenecks in unexpected and undesired places on the WWW. I have already noticed slow page loads directly attributable to ad servers' slow responses, not the site's main server. Sometimes it has been frustrating enough to contemplate remapping ad servers' IPs in my PC's hosts file to localhost/somethingshort/asFiller.html. The reason I have not done this is more my belief that market forces should be left alone whenever possible, and that websites have a right to be part of the market, than it is sloth.

Also, I don't know about your experience, but the server logs I am privy to have seen a big bandwidth-biting uptick in new search spiders from private "vertical" search providers, most of which are unlikely to generate any significant new traffic for a website. When time allows for it, I've instituted a new policy response to vertical search bots. Those that come knocking with forged user-agent strings, posing as browsers, get IP-banned instantly, plus a CIDR range calculated and regexed for. Those that do not follow the protocols of spidering politeness (quick, repeated page requesters, plus those that otherwise break spidering etiquette) get an attempt at communication with their corporate creator and overlord; if I do not receive what I feel is an informative and veracious reply, I kick them out via robots.txt. And if I catch either of these two classes doing end-arounds on these light blocks instead of contacting me, I get proactive with CIDR firewall filtering and simply start dropping those requests externally. I do not think I'd appreciate two or three or four or...? new bandwidth-sucking spiders frequenting these servers because of ad content services. (Sorry about the drift; it wills not to be severed from me.)
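The "CIDR range calculated" step can be sketched in a few lines of Python with the standard-library ipaddress module (the function name and sample addresses are mine, not part of any real toolchain):

```python
import ipaddress

def covering_cidr(ips):
    """Smallest single CIDR block covering all the given IPv4 addresses.

    A blunt instrument: banning the whole block also catches innocent
    neighbours, so eyeball the result before feeding it to the firewall.
    """
    nums = sorted(int(ipaddress.IPv4Address(ip)) for ip in ips)
    lo, hi = nums[0], nums[-1]
    # Widen the prefix until one network spans both endpoints.
    prefix = 32
    while prefix > 0:
        net = ipaddress.ip_network(
            f"{ipaddress.IPv4Address(lo)}/{prefix}", strict=False)
        if int(net.broadcast_address) >= hi:
            return net
        prefix -= 1
    return ipaddress.ip_network("0.0.0.0/0")

# Two requests logged from the same misbehaving crawler:
print(covering_cidr(["192.0.2.7", "192.0.2.73"]))  # 192.0.2.0/25
```

The resulting block drops straight into a firewall deny rule or a log-grepping regex.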
Getting back on target: most CMS codebases now offer meta tag/header optimizers as add-ons, but implementing them is much more than a quick back-end plugin-enabling click followed by autonomous, forget-me functioning. They require both an understanding of head-element coding and time spent on every single page to generate it.
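What such an add-on has to do under the hood is not complicated; a minimal sketch (all names hypothetical) might be:

```python
import html

def head_meta(keywords, description):
    """Render keyword/description meta tags from per-page CMS fields.

    The hard part isn't this function; it's an editor sitting down and
    writing accurate keywords and descriptions for every single page.
    """
    esc = html.escape  # escape quotes etc. so the attribute values stay valid
    return "\n".join([
        f'<meta name="keywords" content="{esc(", ".join(keywords))}" />',
        f'<meta name="description" content="{esc(description)}" />',
    ])

print(head_meta(["widgets", "hardware reviews"], "A review of widgets."))
```

The code is the easy ten percent; the human time per page is the other ninety.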
That's pretty much how the thought process flowed just now.