
7 Key SEO Activities That Can Now Be Automated


Although it’s hard to keep up with the growing number of SEO tools launched in the last few years (along with the new functionality added to existing tools), it’s worth testing them to identify how their features can support our SEO activities and help us execute them more efficiently.

This is especially true when it comes to tasks that are critical or highly beneficial to the SEO process but are complex and/or time-consuming to execute.

That’s why I want to share with you seven such SEO tasks that now can be partially or completely automated with the support of some tools.

1. Assessing Your Industry Traffic Potential
One of the first activities when launching a new website or SEO campaign is to assess traffic potential (ideally per channel) and identify the potential competitors in the market. Estimating this can be challenging, especially when starting to work in a new market that you don’t know anything about.

Nonetheless, SimilarWeb’s “Industry Analysis” reports can greatly help by letting you easily obtain the most important traffic data for any industry in many countries; they also show traffic per source, the most popular sites per channel, and trends.

However, remember to take these numbers as references, not absolutes; and whenever you can, validate with other data sources.

Search Industry Analysis

2. Identifying Keyword Opportunities For Your Sites
Finding new keyword opportunities is important for focusing your SEO process and establishing profitable yet feasible goals.

In the past, doing this type of analysis was time-consuming, but now it can be completely automated with Sistrix’s “Opportunities” feature. With this feature, you can include up to three competitors, and it will show the keywords these competitors already rank for that you’re not yet targeting, along with the level of traffic opportunity and competition for each.

Sistrix SEO Opportunities
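
Under the hood, this kind of keyword-gap analysis boils down to a set comparison between ranking-keyword lists. As a rough illustration of the idea (not Sistrix’s actual method), here is a minimal Python sketch assuming you have exported keyword CSVs for your site and a competitor, with hypothetical `keyword` and `search_volume` columns:

```python
import pandas as pd

# Hypothetical exports of ranking keywords, one row per keyword.
# Column names ("keyword", "search_volume") are assumptions for illustration.
ours = pd.read_csv("our_keywords.csv")
theirs = pd.read_csv("competitor_keywords.csv")

# Keywords the competitor ranks for that we are not targeting yet.
gap = theirs[~theirs["keyword"].isin(ours["keyword"])]

# Rank the gap by estimated search volume as a rough proxy for opportunity.
opportunities = gap.sort_values("search_volume", ascending=False)
print(opportunities.head(20))
```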

3. Identifying Related Relevant Terms To Use In Your Content By Doing A TF-IDF Analysis Of The Top Ranked Pages For Any Query
TF-IDF stands for “term frequency” and “inverse document frequency.” According to the OnPageWiki:

With the TF*IDF formula, you can identify in which proportion certain words within a text document or website are weighted compared to all potentially possible documents. Apart from the keyword density, this formula can be used for OnPage optimisation in order to increase a website’s relevance in search engines.
Although it’s known that TF-IDF has been used to index pages, there hasn’t been a popular tool that applies it to identify the relevant term variations of our topics that we should be using. This information can be used to improve our site’s relevance for other terms our audience uses.

OnPage.org includes a handy TF-IDF tool in its on-page analysis and monitoring platform, which can be used to identify term variations or combinations that our competitors are already using but we still aren’t (by analyzing both the top 15 ranking pages and the page we want to rank). By focusing on terms related to our main keywords, we can increase our site content’s relevance for the desired topic.

tf-idf analysis
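
To make the formula concrete, here is a minimal, self-contained sketch of the TF*IDF calculation over plain-text copies of a set of pages. It is a simplified illustration of the idea, not OnPage.org’s implementation:

```python
import math
import re
from collections import Counter

def tokenize(text):
    # Lowercase word tokens; real tools also handle stemming, stop words, etc.
    return re.findall(r"[a-z0-9']+", text.lower())

def tf_idf(documents):
    """Return one {term: weight} dict per document.

    TF = term count / document length; IDF = log(number of docs / docs containing the term).
    """
    tokenized = [tokenize(doc) for doc in documents]
    n_docs = len(tokenized)
    doc_freq = Counter(term for tokens in tokenized for term in set(tokens))
    weights = []
    for tokens in tokenized:
        counts = Counter(tokens)
        length = len(tokens) or 1
        weights.append({
            term: (count / length) * math.log(n_docs / doc_freq[term])
            for term, count in counts.items()
        })
    return weights

# Hypothetical corpus: the first documents are top-ranking pages, the last is our own page.
docs = [
    "trail running shoes sizing guide and reviews",
    "best trail running shoes compared by terrain and cushioning",
    "buy shoes online",  # our page
]
top_page_terms = tf_idf(docs)[0]
for term, weight in sorted(top_page_terms.items(), key=lambda kv: -kv[1])[:5]:
    print(term, round(weight, 3))
```

Terms that carry a high weight across the top-ranking pages but are weak or missing on your own page are the candidates worth working into your content.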

4. Visualizing Your Site’s Internal Linking
I have written in the past about visualizing a site’s pages and links as a graph to facilitate the analysis of a website’s internal linking, which was doable but took a lot of effort. The process required exporting the URLs crawled, then processing them with visualization tools.

This has now been made easy by OnPage.org’s “Visualizer” functionality. It not only lets you automatically generate the internal link graph of any site, but also provides options to browse the graph, filter the links, show their relationships, and display only the nodes (pages) that follow a certain pattern.

This can be extremely helpful for understanding how a site is internally linked, how many links point to each page, and whether there are any “orphan pages” or areas of the site that are not connected with the rest.

internal link graph
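
If you still want (or need) to build such a graph yourself, the processing step is straightforward. Here is a minimal sketch, assuming a crawler export of internal links with hypothetical `source` and `target` columns, using the networkx library:

```python
import csv
import networkx as nx  # pip install networkx

# Hypothetical crawl export: one internal link per row, columns "source" and "target".
graph = nx.DiGraph()
with open("internal_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        graph.add_edge(row["source"], row["target"])

# Pages with no incoming internal links ("orphan" candidates among crawled URLs).
orphans = [page for page, in_degree in graph.in_degree() if in_degree == 0]
print(f"{len(orphans)} pages without incoming internal links")

# Groups of pages not connected to the rest of the site.
components = list(nx.weakly_connected_components(graph))
print(f"{len(components)} disconnected clusters of pages")

# Hand the graph to a layout/visualization tool (e.g., Gephi) via GraphML.
nx.write_graphml(graph, "internal_links.graphml")
```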

5. Getting All Key Optimization, Link Popularity, Social & Organic Traffic Data For Your Top Site Pages In A Single Place
Gathering the data when doing an SEO audit can be time-consuming. This data includes a website’s technical optimization, content, link popularity, current organic search traffic, and search engine rankings, which we used to obtain from different, non-connected data sources that were a challenge to integrate later.

This data gathering can now be largely automated thanks to URLProfiler, which directly retrieves much of the required data while combining data from many other tools. For example, to get all the key SEO metrics for the highest-visibility pages of your site, you can download the “top pages” CSV from the Search Console Search Analytics report, import it into the Screaming Frog SEO Spider in “list mode,” and crawl the URLs.

Once crawled, you can import them directly into URLProfiler with the “Import from Screaming Frog SEO Spider” option. Then, select the additional metrics you want to obtain for these pages: Mozscape link popularity and social share metrics, Google Analytics organic search traffic data (you’ll be able to select the segment you want), and Google PageSpeed and mobile validation (these require free API keys from Moz and Google).

URL Profiler

Now, you can run URLProfiler and get the results in a few minutes in one spreadsheet: all the data from Screaming Frog, Google Analytics, Mozscape link and social share metrics, and Google PageSpeed and mobile validation for the pages with the highest visibility in Google’s Search Console. It will look like this (and I can’t imagine the time I would have needed to put this all together manually):

URL-profiler
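
If you want to replicate a small part of this consolidation manually, the core step is simply joining the various exports on the URL. A rough sketch, assuming a Screaming Frog export and a Search Console top-pages export with hypothetical column names:

```python
import pandas as pd

# Hypothetical exports; the column names below are assumptions, adjust to your files.
crawl = pd.read_csv("screaming_frog_internal_html.csv")   # e.g. "Address", "Title 1", "Status Code"
search = pd.read_csv("search_console_top_pages.csv")      # e.g. "Page", "Clicks", "Impressions"

# Normalize the URL column name so both frames can be joined.
crawl = crawl.rename(columns={"Address": "url"})
search = search.rename(columns={"Page": "url"})

# One row per top page, with crawl data attached where available.
report = search.merge(crawl, on="url", how="left")
report.to_csv("top_pages_report.csv", index=False)
print(report.head())
```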

There’s no excuse not to run a quick SEO audit of your most important pages, taking all the key metrics into consideration.

6. Getting Relevant Link Prospects With The Desired Requirements And Direct Contact Information
Obtaining a list of sites that are highly relevant to your business might not be that difficult; doing so while looking only for highly authoritative sites from a specific country with visible contact information (among other criteria) is a bit more complex.

All this can be easily done now with the LinkRisk Peek tool, which provides many advanced filters to only get the sites that will be relevant and feasible to use for outreach.

LinkRisk Peek
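
If you already have a raw prospect list from another source, applying the same kind of requirements is a simple filtering job. A rough sketch, assuming a hypothetical CSV of prospects with columns for domain authority, country and a contact email:

```python
import pandas as pd

# Hypothetical prospect export; the column names below are assumptions.
prospects = pd.read_csv("link_prospects.csv")  # e.g. "domain", "domain_authority", "country", "contact_email"

qualified = prospects[
    (prospects["domain_authority"] >= 40)       # only reasonably authoritative sites
    & (prospects["country"] == "US")            # from the target market
    & prospects["contact_email"].notna()        # with visible contact information
]
print(f"{len(qualified)} prospects left after filtering")
qualified.to_csv("outreach_shortlist.csv", index=False)
```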

7. Tracking Daily Rankings Of Full SERPs For Your Relevant Keywords
We used to track rankings for our most important keywords, both for our own sites and for our top competitors. But with ongoing ranking fluctuations, new competitors sometimes appear that we weren’t tracking, and it then becomes hard to correlate our gains and losses against theirs.

Additionally, once we got the ranking information, we had to analyze the pages to identify the potential reasons for the ranking shifts. We did this using tools to obtain the domain/page link popularity, among other factors.

This is now easier to do with tools like SERPWoo. Rather than tracking specified URLs (yours and your competitors’), SERPWoo tracks the top 20 results for your keywords by default. It also includes useful metrics such as page and domain link popularity, social shares, etc., to help marketers more easily analyze the potential causes of a rankings fluctuation.

SERP Tracking
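
Even without a dedicated tracker, the day-over-day comparison is easy to reproduce once you have ranking snapshots. A minimal sketch, assuming two hypothetical CSV snapshots of the top 20 results for one keyword, each with `position` and `url` columns:

```python
import pandas as pd

# Hypothetical daily snapshots of the top 20 results for one keyword.
yesterday = pd.read_csv("serp_2015-06-25.csv")  # assumed columns: "position", "url"
today = pd.read_csv("serp_2015-06-26.csv")

merged = today.merge(yesterday, on="url", how="outer",
                     suffixes=("_today", "_yesterday"))

# New entrants: URLs in today's top 20 that weren't there yesterday.
new_entrants = merged[merged["position_yesterday"].isna()]
print("New URLs in the top 20:", new_entrants["url"].tolist())

# Position change for URLs present on both days (negative = moved up).
both = merged.dropna(subset=["position_today", "position_yesterday"]).copy()
both["change"] = both["position_today"] - both["position_yesterday"]
print(both.sort_values("change")[["url", "position_yesterday", "position_today", "change"]])
```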

I hope that these functionalities help you as much as they have helped me! Which other SEO activities are you now automating that used to take you a lot of time? Please, feel free to share in the comments!

Use Schema Markup With Caution: It May Be Classified As Spammy!


Schema markup (also known as structured data markup) can be a great way to improve search engine content discovery, indexation and organic search visibility. Some structured data markup feeds into Google’s Knowledge Graph, appears in local results, and generates Rich Snippets, all of which is great for improving organic search visibility and click-through rate.

But now, structured data can potentially hurt your site if not used correctly, due to recent “spammy structured markup” penalties from Google. In March 2015, Google updated its rating and reviews Rich Snippet policies, stating that these types of snippets must be placed only on specific items, not on “category” or “list of items” landing pages.
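
To illustrate the distinction, here is a rough example (with invented values) of rating markup attached to one specific product, which is the compliant pattern; placing the same block on a category or list-of-items landing page is what the updated policy disallows:

```python
import json

# Illustrative JSON-LD for a single, specific product page (values are made up).
# Per the updated policy, rating/review markup like this should describe the one
# item the page is about, not a whole category or list of items.
product_markup = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# Embed the output in the page inside <script type="application/ld+json"> ... </script>
print(json.dumps(product_markup, indent=2))
```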

In Google’s recent Quality Update, it seems quite a few sites were hit with Structured Data penalties. Here is an example of a manual Structured Data penalty message sent by Google in the Search Console (formerly Webmaster Tools).

Spammy Structured Data Markup Penalty

The penalty message reads as follows:

Spammy structured markup

Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.

A penalty can be algorithmic or manual. A manual penalty can be partial or site-wide. Google has stated:

In cases where we see structured data that does not comply with these standards, we reserve the right to take manual action (e.g., disable rich snippets for a site) in order to maintain a high-quality search experience for our users.
Obviously, this is something that you do not want to happen to your site. Webmasters should now audit their schema markup implementation on an ongoing basis to avoid this penalty.

How To Avoid Structured Data Markup Penalties
By adhering to a few simple rules, you can avoid a manual penalty on the basis of spammy markup. Here’s my advice:

Ensure your Structured Data implementation aligns with Google’s most recent guidelines. You can review Google’s structured data guidelines and policies in its official documentation.
Test markup in the Structured Data Testing Tool. Before going live with your schema markup, validate it using Google’s Structured Data Testing Tool and address any errors it reports. (A minimal local pre-check sketch follows these recommendations.)
Test Schema Markup in Structured Data Testing Tool

Monitor the Structured Data report in your Google Search Console account (formerly Webmaster Tools). This report will show you your website’s Structured Data indexation and errors.

Structured Data Errors Report

Note: If you don’t have any Structured Data implemented on your website, you will see the following message on the report screen.

No Data in Structured Data Report

Monitor Google’s Webmaster Blog for the latest Structured Data updates and news. The Webmaster Blog can be found here: http://googlewebmastercentral.blogspot.com/.
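
As a complement to the recommendations above, here is the minimal local pre-check sketch referenced in the testing step: it pulls application/ld+json blocks out of a saved page and confirms they at least parse as JSON. It is only a rough sanity check, not a substitute for Google’s Structured Data Testing Tool:

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect the contents of <script type="application/ld+json"> blocks."""

    def __init__(self):
        super().__init__()
        self.blocks = []
        self._in_jsonld = False

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld and data.strip():
            self.blocks.append(data)

# Example usage with a local copy of a page (fetching is left out for brevity).
with open("page.html", encoding="utf-8") as f:
    parser = JsonLdExtractor()
    parser.feed(f.read())

for i, block in enumerate(parser.blocks, 1):
    try:
        data = json.loads(block)
        label = data.get("@type") if isinstance(data, dict) else "list of items"
        print(f"Block {i}: valid JSON, @type = {label}")
    except ValueError as err:
        print(f"Block {i}: invalid JSON ({err})")
```
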
What To Do If You’re Hit By A Structured Data Penalty
If you notice your site’s rich snippets have disappeared in organic search, or indexation in the Structured Data report is down, or you receive a clear message of a spammy markup penalty in Google Search Console, then here are the steps you need to take to address the issues.

Review the Structured Data report in your Google Search Console account. Check for errors being reported; if there are any, look into the specific pages/URLs that are generating them. To view details, simply click on the structured data category.

Structured Data Errors Audit

Here is a screenshot of what the detailed report looks like. Notice the filtered tabs and the information for the respective markup type.

Schema Errors Detailed Report

Cross-reference the respective schema markup implementation and errors with Google’s guidelines and Schema.org. Have your developers fix the errors and, as mentioned before, test your markup in the Google Structured Data Testing Tool before publishing it on your site. If you’re using the JSON-LD format for your Schema implementation, you can also use the JSON-LD Playground tool to test your code.
Submit a reconsideration request once the issues are fixed. If you were hit with a manual penalty for spammy structured markup, you will need to submit a reconsideration request after you have fixed the structured data errors on your site. (You can read more about the Reconsideration Request process here.) Here is an example of a reconsideration request for a structured data penalty:
Google Webmaster Team,

Example.com was hit with a manual site-wide penalty for “spammy structured markup” and has since updated schema markup implementation to align itself with Google’s guidelines and best practices. Upon investigation, we noticed that Example.com’s schema markup was very outdated and not properly implemented from a code perspective, which was the primary cause of this issue.

We have thoroughly reviewed and addressed the errors; all outdated schema markup has been removed from Example.com. Since then, Example.com has hired [SEO agency] to confirm proper alignment with Google’s guidelines by both manually reviewing the source code as well as running tests with Google’s Structured Data Testing Tool. Additionally, [SEO agency] will aid with the QA and monitoring process to avoid further issues with any new schema markup implementation on the site.

Example.com’s goal is to have a high quality website that provides value to its users and meets their needs, as this aligns with Google’s guidelines. We kindly ask that Google process this reconsideration request to remove the manual spammy structured markup penalty from Example.com. Further documentation can be provided upon request.

Thank you,
John Doe
Example.com

There has been a lot of discussion lately about Google’s timeline for processing reconsideration requests. At Elite SEM, we were able to get a Structured Data manual penalty removed in under a week for one of our clients. Here is a screenshot of the messages.

Manual penalty Revoked by Google

Along with monitoring your website’s link profile and content quality on an ongoing basis to avoid Google Penguin and Panda penalties, you also need to pay attention to your structured data markup implementation to avoid a spammy structured markup penalty.
