An afternoon session on search engine optimization for very large sites addressed the needs of websites with an immense number of pages online.
How have our readers with massive sites handled the expanded demands of SEO? Tip us off about your strategies at WebProWorld.
Jason Lee Miller from Murdok spent some time at the afternoon session on Big Site/Big Brand SEM today, and passed along some of the more pertinent information he gathered from the meeting.
IBM’s Bill Hunt, a Search Effectiveness Team Lead and co-author of the book Search Engine Marketing, Inc., started things off with five key points on doing SEM for a very large site:
1. Develop the business case
2. Education and awareness
3. Remove barriers to spiders/crawlers
4. Leverage page templates to fix multiple pages
5. Leverage your site for links
Even with his knowledge of SEM, Hunt said there is no “secret sauce” that will get a site into the “Golden Triangle” atop Google’s results page. SEM pros working with big sites need to implement familiar strategies, like ensuring pages welcome spidering, researching keywords and phrases, and populating pages with keywords relevant to the content.
Where the strategy changes is in scale. Hunt suggested segmenting the task into pieces, with the project manager developing short- and long-term goals to get the most out of the search development process.
The SEO team should check the indexes with a site:yoursite.com query to see how many pages Google and the other engines have indexed. Verify that pages get crawled and indexed regularly; this is critical to determining whether there is a problem.
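To put a baseline number next to that site: check, the count of URLs a site publishes in its sitemap can be compared against what the engines report as indexed. Below is a minimal sketch, assuming the site exposes a standard sitemap.xml; the URL is hypothetical.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical sitemap location; substitute your own.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return the <loc> URLs listed in a standard sitemap file."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    return [loc.text.strip() for loc in tree.getroot().findall("sm:url/sm:loc", NS)]

if __name__ == "__main__":
    urls = sitemap_urls(SITEMAP_URL)
    print(f"{len(urls)} URLs published in the sitemap")
    print("Compare this figure against the result count of a site:example.com query.")
```

A large gap between the two numbers is the kind of signal Hunt describes: it suggests pages are being published that the engines never pick up.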
Compile a definitive list of keywords and phrases that should bring customers to you from search engines. Audit pages and check for keyword inclusion; any barriers to crawlability that turn up need to be fixed.
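To make that audit concrete, a script can fetch each page and flag two of the things mentioned here: whether the target phrase appears in the title, and whether a noindex robots meta tag is blocking indexing. A minimal sketch, assuming a hand-built mapping of hypothetical URLs to target phrases:

```python
import re
import urllib.request

# Hypothetical pages and the phrase each one is supposed to target.
PAGES = {
    "https://www.example.com/widgets/": "blue widgets",
    "https://www.example.com/gadgets/": "wireless gadgets",
}

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)
NOINDEX_RE = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', re.IGNORECASE)

def audit(url, keyword):
    """Flag a missing keyword in the <title> and any noindex barrier on one page."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    match = TITLE_RE.search(html)
    title = match.group(1).strip() if match else ""
    problems = []
    if keyword.lower() not in title.lower():
        problems.append(f"keyword '{keyword}' missing from title '{title}'")
    if NOINDEX_RE.search(html):
        problems.append("page carries a noindex robots meta tag")
    return problems

if __name__ == "__main__":
    for url, keyword in PAGES.items():
        for problem in audit(url, keyword) or ["no obvious barriers found"]:
            print(f"{url}: {problem}")
```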
Above all, track everything and find your metrics.
The New York Times chief strategist Marshall Simmonds delivered a big number for a big site: a 1,095 percent increase in search referrals for About.com (owned by the Times) since an SEO initiative began in 1999. The Times has 1.3 million pages available online, and saw a 30 percent increase in search referrals after the start of its own initiative in May 2005.
Simmonds led off with a reality check for big sites: SEO will be a long-term project, and it isn’t going to happen for free. “If you rely on the Internet for business, you have to have a search strategy,” he said.
When many web pages are managed by several different people, from content providers to technical staff in different departments with their own budgets and goals, the company needs a standardized approach to site design and format. An integrated approach is important for big sites because the work can’t be done on a page-by-page basis.
That standardization means using templates for pages, just as Hunt mentioned earlier. Each department in the company should have an SEO checklist that accounts for items like a unique title for every page, annotations on every link, and keyword-rich page copy.
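One checklist item, the unique title, lends itself to an automated spot check across template-generated pages. A minimal sketch, assuming you already have a URL-to-title mapping from a crawl; the data here is hypothetical:

```python
from collections import defaultdict

# Hypothetical output of a site crawl: URL -> <title> text.
CRAWLED_TITLES = {
    "https://www.example.com/products/a/": "Acme Products",
    "https://www.example.com/products/b/": "Acme Products",
    "https://www.example.com/about/": "About Acme",
}

def duplicate_titles(titles):
    """Group URLs that share a title; a shared title usually means the
    template never received a page-specific one."""
    groups = defaultdict(list)
    for url, title in titles.items():
        groups[title.strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

if __name__ == "__main__":
    for title, urls in duplicate_titles(CRAWLED_TITLES).items():
        print(f"'{title}' is reused on {len(urls)} pages: {', '.join(urls)}")
```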
Swapping text in for graphics wherever possible helps both page speed and indexing. Focus on the essentials: keywords, description tags, good content, and linking. For the latest lists of search engine IP addresses and hostnames, Simmonds recommends http://iplists.com/nw/ as a reference.
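One common use of such IP and hostname lists is verifying that a visitor claiming to be a search engine crawler actually resolves to that engine. A minimal sketch of a reverse-then-forward DNS check; the hostname suffixes and the sample address are illustrative, not a definitive list:

```python
import socket

# Hostname suffixes legitimate crawlers are expected to resolve to
# (illustrative only; maintain the real list from sources like iplists.com
# or the engines' own documentation).
CRAWLER_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_crawler(ip_address):
    """Reverse-resolve the IP, check the hostname suffix, then confirm with a forward lookup."""
    try:
        hostname = socket.gethostbyaddr(ip_address)[0]
    except socket.herror:
        return False
    if not hostname.endswith(CRAWLER_SUFFIXES):
        return False
    try:
        # The forward lookup must point back at the same IP to rule out a spoofed PTR record.
        return ip_address in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False

if __name__ == "__main__":
    print(is_verified_crawler("66.249.66.1"))  # an address from Googlebot's published range
```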
Intel manager for Worldwide Search Martin Laetsch noted that when it comes to big sites, an Excel spreadsheet isn’t going to cope with the volume of data generated. Big sites need dedicated SEO tools.
Often there is keyword conflict and competition between different departments of the company, each with its own budget and goals. Laetsch said nine different departments within Intel were competing for the word “Pentium.”
One way to handle this is to give everybody a common landing page with a pitch about their project, so that nobody inside the company is competing for the same keywords.
During the question and answer session, Miller asked about meta keyword tags: since many people in SEO have said only Yahoo crawls those tags, why should sites use them?
Hunt said keyword tags were critical for internal search, while Simmonds noted that optimized misspellings were an area where keyword tags could help.
When a question arose about organic versus paid search, Hunt said that of the people who see a site in both listings, 92 percent will click through to the site.
Laetsch felt that bidding on keywords “degrades our brand,” while Simmonds said that in news it’s necessary to do so.
Simmonds also said About.com’s use of subdomains “was one of the smartest things they could have done.”
Search engines treat subdomains like separate websites, which makes them a better choice than creative URLs that descend several directories deep; many crawlers may not go as deep into a site as the publisher would like.
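Since crawl depth is the concern, a quick way to spot pages buried too far down is to count the path segments in each URL. A minimal sketch over a few hypothetical URLs; the depth threshold is an assumption, not a rule from the session:

```python
from urllib.parse import urlparse

# Hypothetical URLs: one on a subdomain, one buried several directories deep.
URLS = [
    "https://news.example.com/story-123/",
    "https://www.example.com/archive/2006/05/section/topic/story-123/",
]

def path_depth(url):
    """Count the directory levels below the host in a URL."""
    return len([segment for segment in urlparse(url).path.split("/") if segment])

if __name__ == "__main__":
    for url in URLS:
        depth = path_depth(url)
        note = "  <- consider a subdomain or a flatter path" if depth > 3 else ""
        print(f"depth {depth}: {url}{note}")
```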
David Utter is a staff writer for Murdok covering technology and business.