
Outback Steakhouse

Improving SEO During a Major Site Migration

BEFORE CLICK LABORATORY

Migrating a Large, Complicated Website

We recently worked with Outback.com on the release of their new website. With a large volume of organic visitors and an aggressive use of new technology, there were real concerns about the new site’s impact on SEO and incoming organic traffic; even small losses in organic traffic would mean big losses in revenue. Click Laboratory worked with the team weekly through the entire project to ensure that search rankings would not suffer and that organic traffic would, ideally, improve.

The new website uses Angular, a JavaScript framework that makes the entire user experience feel more like an app and less like a static website. Content changes constantly and the site has a lot of moving parts. It truly is on the leading edge of what can be done for a restaurant website.

But these innovations presented a challenge: search engines have had a hard time keeping up with changes in front-end technology. Click’s task was to make sure the site could do everything the Digital Team at Bloomin’ Brands wanted while still supporting SEO.

Here are some of the items Click looks at when deploying a new site such as Outback.com to ensure SEO isn’t harmed.

Content Issues

  1. 301 Redirects.  Any page that won’t exist on the new site, or whose URL is changing, should have a 301 redirect set up so that search engines know where the content moved (see the sketch after this list).
  2. 302 Redirects.  These are temporary redirects for the case where you have to launch the new website before some of your pages are quite ready to go.
  3. New Content. Hopefully, a new website means you are launching new and improved content as well.  New content should always have a purpose aligned with the primary goals of the website.
  4. Page Structure.  Make sure that your content is organized well for people, not just search engines.
  5. Legacy Page Content.  You may need to update older content that is still valid but was written before the keywords or search engine rules changed.
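
To make item 1 concrete, here is a minimal sketch of a 301 redirect map in a Node/Express layer, written in TypeScript. The old and new paths are hypothetical examples, not Outback.com’s actual URLs, and your CDN, web server, or CMS may offer a simpler place to configure the same thing.

```typescript
import express from "express";

const app = express();

// Hypothetical map of retired URLs to their new homes.
const redirects: Record<string, string> = {
  "/old-menu.html": "/menu",
  "/locations.aspx": "/locations",
  "/specials/summer": "/specials",
};

// A 301 (permanent) redirect tells search engines to transfer the old
// page's ranking signals to the new URL.
app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    res.redirect(301, target);
  } else {
    next();
  }
});

app.listen(3000);
```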

Crawl Issues for SEO

  1. Tracking 404 Errors.  If you have a large site with lots of legacy links, releasing a new website can surface a lot of 404 errors you don’t know about.  Create a redirect for the important ones and a good 404 page, one that actually helps people, for everything else.  Keep your 404s to a minimum if you can (the audit sketch after this list shows one way to check a list of legacy URLs).
  2. URL Hierarchy.  Search engines still think fairly linearly, though they are improving.  They can’t work out how content is organized and related unless the pages link to each other well or the content is arranged in a tree structure.  If your content doesn’t naturally link together, the easiest approach is to put everything into a tree structure.  That makes the most sense to your visitors as well, and it’s easier for you to manage.
  3. Canonical Tagging.  If you don’t get this one right, you’re in for a lot of headaches.  A canonical tag tells search engines which page is the official version of the content.  This is extremely useful when you have multiple landing pages with almost the same text (such as campaign pages or variations of the homepage).  Double and triple-check your canonical links, because you don’t want to accidentally tell Google to look in the wrong place for your most popular pages (see the Angular sketch after this list).
  4. Updating Offsite Links.  If you’re a popular site like Outback, you will have a LOT of links out there on the Internet.  Get as many as you can updated to the correct URLs, especially on services such as Yelp that maintain links to your content for rich snippets or in other apps.
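
For 404 tracking, a quick pre-launch audit is to run your legacy URL list (exported from analytics or the old sitemap, for example) against the new site. A rough TypeScript sketch, assuming Node 18+ for the built-in fetch; the URLs are placeholders:

```typescript
// Hypothetical legacy URLs pulled from analytics or the old sitemap.
const legacyUrls = [
  "https://www.example.com/old-menu.html",
  "https://www.example.com/locations.aspx",
];

async function auditLegacyUrls(urls: string[]): Promise<void> {
  for (const url of urls) {
    // "manual" redirect mode reports the 301/302 itself instead of
    // silently following it, so we can see where each URL ends up.
    const res = await fetch(url, { redirect: "manual" });
    if (res.status === 404) {
      console.log(`MISSING   ${url}`);
    } else if (res.status >= 300 && res.status < 400) {
      console.log(`REDIRECT  ${res.status} ${url} -> ${res.headers.get("location")}`);
    } else {
      console.log(`OK        ${res.status} ${url}`);
    }
  }
}

auditLegacyUrls(legacyUrls);
```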
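
And since the site runs on Angular, the canonical link itself can be managed from a small service. This is only an illustrative sketch using Angular’s DOCUMENT token, not the approach Bloomin’ Brands actually used; a router event handler would typically call setCanonical() after each navigation.

```typescript
import { Inject, Injectable } from "@angular/core";
import { DOCUMENT } from "@angular/common";

@Injectable({ providedIn: "root" })
export class CanonicalService {
  constructor(@Inject(DOCUMENT) private doc: Document) {}

  // Point search engines at the one "official" URL for this content, even
  // when campaign landing pages or URL parameters duplicate it.
  setCanonical(url: string): void {
    let link = this.doc.querySelector<HTMLLinkElement>('link[rel="canonical"]');
    if (!link) {
      link = this.doc.createElement("link");
      link.setAttribute("rel", "canonical");
      this.doc.head.appendChild(link);
    }
    link.setAttribute("href", url);
  }
}
```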

Technical Issues for SEO

  1. Sitemap.xml Updates.  Even if your site can be fully crawled by search engines, it is always a good idea to publish a sitemap.xml file and tell the search engines where to find it.  It’s a good backup when you’re about to launch a new website that you aren’t positive search engine robots can crawl (a simple generation script is sketched after this list).
  2. Valid Schema Markup.  This is very important if you are a company with lots of locations.  Schema markup is the clearest way to give search engines accurate information about each of your local stores (see the JSON-LD sketch after this list).
  3. Secure Connection.  The Outback.com website switched from unsecured HTTP to secure HTTPS.  In the past, Google and Bing treated this switch as a brand-new website, which meant your ranking history was effectively gone.  Fortunately, search engines now handle the change far better, so it is no longer the worry it once was.
  4. Angular Website.  Many websites are moving to some type of JavaScript framework.  Whether it is Angular, React, or something else, these frameworks can create a great user experience in the hands of a great team like Bloomin’ Brands, but the search engines have not kept good pace with crawling them effectively.  Do a lot of heavy testing ahead of time with tools like Screaming Frog to make sure your content can be seen by standard crawlers AND mobile crawlers (a simple spot check is sketched after this list).
  5. Website Speed.  Speed seems to be of ever-increasing importance to search engines, and it has always mattered to visitors.  There are many tools, such as Pingdom and Google PageSpeed Insights, for testing your site speed.  But a mediocre PageSpeed score doesn’t necessarily mean your site is slow; I have seen sites with faster actual load times score worse than slower ones.  You’ll have to balance what is possible against your budget, but make the site as fast as you can with the resources you have.
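
For the sitemap item, if your platform doesn’t generate one automatically, even a small build-time script will do. A hedged TypeScript sketch; the URL list is a stand-in for whatever your routes or content API actually provide.

```typescript
import { writeFileSync } from "node:fs";

// Stand-in list of canonical page URLs; in practice this would come
// from your routing configuration or a content API.
const pages = [
  "https://www.example.com/",
  "https://www.example.com/menu",
  "https://www.example.com/locations",
];

const today = new Date().toISOString().slice(0, 10);

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  pages
    .map((url) => `  <url><loc>${url}</loc><lastmod>${today}</lastmod></url>`)
    .join("\n") +
  `\n</urlset>\n`;

writeFileSync("sitemap.xml", sitemap);
console.log(`Wrote sitemap.xml with ${pages.length} URLs`);
```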
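
For schema markup, location pages usually carry their data as JSON-LD. Here is a small TypeScript helper that builds a schema.org Restaurant object for a completely made-up location; the output would be serialized into a `<script type="application/ld+json">` tag on the page.

```typescript
// Hypothetical location record; real data would come from a locations API.
interface RestaurantLocation {
  name: string;
  streetAddress: string;
  city: string;
  state: string;
  zip: string;
  phone: string;
  url: string;
}

// Build a schema.org Restaurant object for one local store.
function buildRestaurantSchema(loc: RestaurantLocation): object {
  return {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    name: loc.name,
    servesCuisine: "Steakhouse",
    telephone: loc.phone,
    url: loc.url,
    address: {
      "@type": "PostalAddress",
      streetAddress: loc.streetAddress,
      addressLocality: loc.city,
      addressRegion: loc.state,
      postalCode: loc.zip,
      addressCountry: "US",
    },
  };
}

console.log(
  JSON.stringify(
    buildRestaurantSchema({
      name: "Example Steakhouse - Tampa",
      streetAddress: "123 Example Blvd",
      city: "Tampa",
      state: "FL",
      zip: "33602",
      phone: "+1-813-555-0100",
      url: "https://www.example.com/locations/tampa",
    }),
    null,
    2
  )
);
```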
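
And for the JavaScript framework item, alongside a full Screaming Frog crawl, a quick spot check is to fetch a page with desktop and mobile crawler user-agent strings and confirm that a key phrase actually appears in the HTML the crawler receives. A rough sketch assuming Node 18+ fetch; the URL and phrase are placeholders, and this only inspects the raw response, not Google’s rendered view.

```typescript
// Placeholder page and phrase; pick a phrase that only appears once the
// JavaScript framework has rendered its content.
const pageUrl = "https://www.example.com/menu";
const mustContain = "Bone-In Ribeye";

const userAgents: Record<string, string> = {
  desktop: "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
  mobile:
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 " +
    "(KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 " +
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
};

async function checkCrawlerVisibility(): Promise<void> {
  for (const [label, ua] of Object.entries(userAgents)) {
    const res = await fetch(pageUrl, { headers: { "User-Agent": ua } });
    const html = await res.text();
    const found = html.includes(mustContain);
    console.log(`${label}: status ${res.status}, "${mustContain}" ${found ? "found" : "NOT found"}`);
  }
}

checkCrawlerVisibility();
```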

Analytics

  1. Google Tagging.  Analytics doesn’t affect SEO, but it is your main tool for knowing what is going on.  If you use Google Analytics, it is better to deploy it through Google Tag Manager.  You can manage several of your JavaScript tags there, which can help load times, and Tag Manager lets you be much more granular with your web analytics than Google Analytics alone (see the sketch after this list).
  2. Webmaster Tools.  Through any rollout, be sure to check Google Search Console and Bing Webmaster Tools regularly.  They will give you an early warning if anything goes wrong with the new website.  Also remember to update the sitemap.xml location in these tools; you don’t want them reading a dead file or, worse, an old one once the new website is live.
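
For the Tag Manager item, the extra granularity usually comes from pushing custom events into the GTM data layer, which your tags and triggers can then react to. A minimal TypeScript sketch for a hypothetical event; the event name and fields are made up for illustration.

```typescript
// Tell TypeScript about the dataLayer array that the GTM snippet creates.
declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

// Push a hypothetical custom event; a GTM trigger watching for
// "menu_item_view" can then fire whatever analytics tags you configure.
export function trackMenuItemView(itemName: string, category: string): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: "menu_item_view",
    menuItem: itemName,
    menuCategory: category,
  });
}

trackMenuItemView("Bloomin' Onion", "Appetizers");
```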

WITH CLICK LABORATORY

Doubling Targeted Organic Traffic

So how did this all work for Outback.com?  Working with the team, we added a large amount of new content focused on steak-related terms and expanded the content for menu items.  As the graph below shows (sorry, we can’t share the actual numbers), organic visits from search engines to menu items nearly doubled.  There is still plenty to do to drive these numbers higher, and our teams continue to work together to find new ways to increase organic traffic.

[Graph: organic search visits to menu item pages]
