Category Archives: SEO

What This Startup Can Teach CMOs About SEO

If you’re a Chief Marketing Officer at a digital business in 2017, chances are a large part of your time is already taken up by dealing with SEO. And thanks to Google’s ever-changing algorithm, what you learn today may not be true tomorrow, meaning you constantly have to stay up to date on the latest algorithm updates and SEO trends. Luckily, a few lessons remain true through algorithm updates and changing times, and you can apply them to build a future-proof site.

The founders of Los Angeles-based Everipedia, Inc. saw that Wikipedia’s model for search engine dominance was ripe for disruption and innovation, and set out to redesign the online encyclopedia for the modern age. To do that, they needed to build a search engine authority similar to Wikipedia’s dominant presence throughout Google’s results. What started out as a small project in a UCLA dorm room has now turned into one of the world’s largest encyclopedias, with millions of users and a company valuation of $22 million.

Below, Everipedia’s founders share their most important optimization lessons for CMOs that will help bring your website to the top of Google’s search results.

Focus On Mobile Design And Usability

In 2016, mobile overtook desktop as the primary device people use to browse the web, and Google has been quick to update its algorithm to make it more mobile-oriented. Many industries and websites are seeing their share of mobile traffic steadily climb. But even though responsive design has been around for a while now and is well-established, a majority of websites still fall short on mobile usability.

Theodor Forselius, Everipedia’s Head of Design, describes what the company has done in regards to mobile optimization: “At Everipedia we have actually focused more on our mobile functionality and usability than we have on desktop. All of our pages on mobile are built with Google’s Accelerated Mobile Pages (AMP) framework, which gives our pages priority in Google’s SERP over competitors.

The AMP framework also significantly improves our page speeds on slow 3G/4G connections which in turn decreases the bounce rate and signals Google that the page is user friendly.”
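For context, here is a minimal sketch of the required AMP page shell (the URLs and titles are placeholders, and the mandatory <style amp-boilerplate> rules are omitted for brevity):

<!doctype html>
<html amp lang="en">
  <head>
    <meta charset="utf-8">
    <!-- The AMP runtime is loaded asynchronously from the AMP CDN -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <title>Example article</title>
    <!-- Points back to the regular (non-AMP) version of the page -->
    <link rel="canonical" href="https://www.example.com/article.html">
    <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
    <!-- The required <style amp-boilerplate> block would go here -->
  </head>
  <body>
    <h1>Example article</h1>
  </body>
</html>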

The lesson for CMOs: If the desktop version of your site is better than the mobile version, your priorities are misplaced.

Can we keep doing the same for SEO in 2016 as we did in 2015?

By Jayson DeMers, Founder & CEO, AudienceBloom

The SEO industry is volatile, and every month something new seems to shake up the scene and force us to reevaluate our priorities.

Best practices in 2010 don’t have much in common with best practices today, yet we believe that many of today’s best practices will be relevant indefinitely.

As we near the end of 2015, it’s important to consider which elements of your SEO campaign will be relevant throughout 2016 and which ones might expire or change in unfamiliar ways.

What’s changing?

In 2015, we witnessed a host of changes to the SEO landscape, from tweaks to ranking factors to shifts in potential visibility. In 2016, I’m anticipating more changes along these lines.

By looking back, we can determine Google’s (and other search engines’) priorities, and use those to estimate changes that are around the corner.

Social media will become more important for search visibility

Currently, social media probably plays a minimal role in directly influencing your rankings, though it likely plays strong indirect roles in doing so.

Greater social signals (such as users sharing your content or interacting with your brand) can help you rank higher, but for the most part social media serves as a great external channel to generate more inbound traffic for your site.

However, Google and other search engines are working harder to incorporate social media posts in new ways. For example, Twitter tweets are now embedded in certain search results.

[Image: tweets embedded in Google search results]

As this trend continues into 2016, posting on social media will continue to grow in importance to search visibility, though probably not in direct correlation to your site’s rankings.

‘Desktop-focused SEO’ will begin its descent into irrelevancy

In 2015, mobile traffic finally surpassed desktop traffic, and Google released its so-called “Mobilegeddon” update to phase out any sites that weren’t optimized for mobile devices. 2015 was the year mobile became the dominant form of web traffic, and 2016 will be a continuation of the rise of mobile.

Google’s own John Mueller stated this year that mobile-only sites (i.e., sites without a dedicated desktop version) suffer no ranking penalty.

Google has all but abandoned desktop-focused SEO, and you should too as we move into 2016.

Information-based content traffic will cease

Content that provides general information is becoming obsolete. This is partly because online content is becoming oversaturated, but even more so because of new technological developments like the Google Knowledge Graph and Microsoft’s Cortana.

Digital assistants and advanced algorithms can now give users immediate information without ever routing them to an external site.

Instead of trying to write about that general information, shoot for more niche, unique topics.

[Image: a Google answer box displaying information directly in the results]

External links will change

External links have been shrinking in importance for the past three years or so, but new forms of link building have arisen. Brand mentions, which don’t use any explicit link, and off-site reviews are serving as new forms of off-site authority building.

Even newer forms of link building, like links to specific sections within apps, will grow in importance in 2016.

Local SEO will evolve further

The big local SEO shakeup in 2015 was the introduction of the local 3-pack, but thanks to increased interest in wearable technology, greater activity of local businesses, and general consumer needs, expect to see more local SEO changes in 2016.

Reviews and local citations will become more important, and geographic-based searches will become even more specific, serving at the neighborhood level instead of a city or region.

What’s staying the same?

Now that we’ve seen all the ways SEO may change in the next year, I’d like to focus more on what’s staying the same.

I’d like to believe that certain best practices really are timeless, or at the very least, that some best practices have a few more years left in them.

Keep these practices central to your SEO campaign well into 2016, as they aren’t in any immediate risk of being phased out:

Content is still king. Despite some forms of information-based content starting to lose out to digital assistants and aggregated material, unique, quality content is still your best friend.

People still need opinions, insights, entertainment value, and personality, and it’s still going to be your job to give it to them in 2016.

On-site optimization is still about user experience. Some on-site factors are growing or shrinking in importance. For example, site security will be even more important in 2016.

But the bottom line is that on-site tweaks are still focused on user experience. If a change would make your site faster, safer, and easier to use, it’s probably good for SEO (and even if it isn’t, it will help your conversion rates).

Authority building still occurs off-site; to build a reputation, you still need off-site signals like inbound links, social signals, and reviews.

As I mentioned above, the nature of external links is evolving, but brand mentions, off-site listings, and consumer reviews are filling that gap as a new form of off-site authority building. The more relationships you can build with off-site authorities, the better.

Obviously, there are more best practices than these to consider when you’re structuring an SEO campaign for the future, but these blanket concepts will help you understand your main priorities.

There’s a significant degree of uncertainty with these predictions, as historic trends and patterns of growth don’t necessarily dictate a consistent future, and timing, of course, is sensitive to hundreds of unseen variables.

One thing is certain, however: SEO in 2016 will not be the same as SEO in 2015. Technologies, systems, and trends change too rapidly to support any one set of goals or practices for long.

Stay cognizant of industry-related changes and work quickly to adapt when things shift—as long as you keep a reasonable pace of development, you should have no problem outperforming the competition.

How to Fix & Prevent Duplicate Content Issues

International SEO is becoming increasingly important for online companies aiming to grow. Thanks to the Internet, expanding your business to any country in the world is only a click away. All you need is an optimized website that caters to audiences across national borders. However, doing so without duplicate content can be easier said than done.
First, there’s the major problem of the language barrier. There are also international search engines that go well beyond Google and Bing. For example, Yandex is the preferred search engine in Russia and Baidu is popular in China. Even when you simply translate your content into different languages, you risk being penalized by Google and other search engines for duplicate content. In this article, I’ll look closely at the content issues businesses commonly face with international SEO.
Debunking Myths About International SEO

Let’s start by bursting some popular myths circulating around the international SEO realm so that you can stay on the right track.

Myth 1: There’s only one way to effectively penetrate a different country’s market, and it’s through different domains.
Fact: All you need is a website that gives out good signals in a particular region, or globally for that matter. There are three approaches for doing so: sub-directory, sub-domain, and ccTLD (see the examples after this list). You can choose any one according to your business’s budget and team.
Myth 2: You should buy the maximum number of domain names.
Fact: Remember that buying multiple domain names is fine as long as you’re doing so to cover common misspellings of your brand. Beyond that, multiple domain names hold no real value. In fact, I would venture to say that they’ll hinder your international SEO efforts. Page redirections weigh heavily on your website by slowing page download times.
Myth 3: You need .com domains to facilitate the creation of subfolders meant for international SEO.
Fact: The common notion that .com is foolproof when it comes to targeting international traffic is false. Businesses can create subfolders in other gTLDs such as .org and .net. You can also choose to go with a ccTLD.
Myth 4: Domain names with keywords can’t fail.
Fact: Having keywords in the domain name doesn’t guarantee traffic. I recommend focusing on building your brand instead of hunting for such exact-match domains.
Myth 5: Google Translate is a perfect tool.
Fact: In its current form, Google Translate is far from flawless. Ideally, you should hire an expert fluent in the foreign language to translate the content on your behalf for better results.
Myth 6: Keywords are the same worldwide.
Fact: This is a major misconception that can drag down your business, because keywords vary by region. For instance, singular and plural forms differ in German. Use a tool like Ubersuggest to zero in on the right keywords for your international SEO campaigns.
Myth 7: One approach for all countries will work.
Fact: Our globe is extremely diverse, so your marketing campaign needs to change based on each culture’s uniqueness. Keep in mind that holidays and festivals vary from country to country.
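To make Myth 1’s three approaches concrete, here are hypothetical URL structures a site might use to target Germany (example.com is a placeholder domain):

Sub-directory: https://www.example.com/de/
Sub-domain: https://de.example.com/
ccTLD: https://www.example.de/

A sub-directory is the cheapest to set up and maintain, while a ccTLD sends the strongest country signal, as discussed later in this article.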
The Major Issue In International SEO

First and foremost, the biggest problem in international SEO is finding the right approach to targeting different countries. Second, most businesses struggle to avoid duplicate content. Writing unique content for each page per location isn’t the best approach: it inflates your costs and consumes valuable time.
Avoiding Duplicate Content in International SEO

Before I get into the topic of duplicate content, you must understand that you can create either a multilingual website or multi-regional website. Here are the differences:

Multilingual website: Multilingual websites offer content in more than one language. For example, your website could include two language versions in Latin and English. Yet, you might still target users in the United States only.
Multi-regional website: Multi-regional websites target customers from different regions. In this case, you might have two versions of your website for two regions. One version might be targeting the UK, whereas the other version targets the US. Both would be in English. Of course, you could also have two versions for two regions in two different languages.
How to Manage Multilingual Versions on Your Website?

Content in different languages is not considered duplicate content if it’s translated manually with correct grammar and intent. However, auto-translated text from tools like Google Translate may create duplicate content issues. You can use robots.txt to block search engines from crawling automatically translated pages, which helps you avoid duplicate content. When Google indexes incoherent text, it might regard such content as spam and block the page. Steer clear of machine-translating boilerplate content into different languages, as it negatively affects user experience and isn’t acceptable to search engines.
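As a hedged sketch of that blocking approach, assuming the auto-translated pages live under a hypothetical /translated/ directory:

User-agent: *
Disallow: /translated/

<!-- Alternatively, in the <head> of each auto-translated page: -->
<meta name="robots" content="noindex">

Note that robots.txt prevents crawling, while the meta tag lets the page be crawled but keeps it out of the index; either keeps machine-translated text from competing as duplicate content.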

For international SEO, it’s best to use a single language per page, with a navigation tab that lets visitors choose the language they speak. Text should be manually translated. Ensure that each language version can be discovered easily. Refrain from using cookies to guide a user from one language to another. Automatic redirection is also a bad idea because it prevents visitors from exploring the website the way they want, and it can limit search engines from indexing your website in its entirety. Use interlinking between languages so that users can land on the right language with a simple click.

Preferably, your URL should tell the user what language they’re getting. If you want a French user to click on the French version of the text, the URL should contain French words without any English. Google doesn’t use code-level information like the “lang” attribute to understand a website’s language; the search engine reads the content of the page to understand it. I’ll cover how to use the hreflang tag to stay on the safe side.

Targeting a Specific Country

Google encourages website owners to tell it which countries they are targeting in order to enhance search results. The search engine recognizes the following elements of international SEO for this purpose:

ccTLD (country-code top-level domain): Each ccTLD relates to a specific country. For instance, .com.au is used for Australia, .in for India, and .de for Germany. Using the appropriate ccTLD gives Google a clear indication that your business is targeting a particular nation.
Location of the server: Though the location of the server is not always a definitive indicator, Google takes it into consideration using the IP address. Google does understand that some websites use content delivery networks and may be hosted in a different country to provide better web server infrastructure.
Use geo-targeting carefully: Geo-targeting is an international SEO tool that can be used to define the targeted country in Search Console, but you must be careful with it. If you are already targeting countries with ccTLDs, it doesn’t make sense to use it. Geo-targeting is usually used by websites with gTLDs, such as .com or .net, and makes the most sense for generic top-level domains not affiliated with countries.
Don’t ignore your address and phone number: You do not get more local than your address and phone number. Having a physical international address will boost your authority. This is where Google My Business plays a big part.
Tackling Duplicate Content on International Sites

It’s common for websites to provide similar or the same content in different languages when targeting different regions through different URLs. Google is okay with this as long as the users are from different countries. Your website will not be penalized when the translation is manual and accurate. Even though Google still prefers unique content for each version, it understands that producing unique content for every region can be quite tough. Google clearly states that you don’t need to hide such content by blocking crawling with a robots.txt file or using a noindex robots meta tag.

The circumstances are entirely different if you’re providing the same content to the same audience through two URLs. Let me explain with an example. Imagine you’ve created yourbusiness.com and yourbusiness.com.au; one targets the USA and the other targets Australia. Since both are in English, this can look like duplicate content. Luckily, it can easily be solved using the hreflang tag, which is widely accepted by search engines globally.
Using the “Hreflang” Tag

As I mentioned earlier, the hreflang tag protects international SEO campaigns from duplicate content penalties. It’s usually needed by businesses that cater to different languages or countries through sub-domains, subfolders, or ccTLDs. The hreflang tag is also important if you serve multiple languages for one single targeted country. Here’s how you can go about implementing it:

Step 1: First, we must handle language targeting. You’ll have to list the URLs that have equivalents in different languages. Any stand-alone or non-equivalent URLs don’t need the hreflang tag, so don’t list them.
Step 2: Now comes setting up the tag. This is what a general hreflang tag looks like:
<link rel="alternate" hreflang="es" href="http://es.general.cz/" />
Let’s envision that the page in question is www.mysite.com/page2.html and you want a German version of it.
You’ll simply change it to
<link rel="alternate" hreflang="de" href="http://www.mysite.com/de/seite2.html" />
For a Spanish version, you’d change it to
<link rel="alternate" hreflang="es" href="http://www.mysite.com/es/pagina2.html" />
All you need are the right language and region codes. Repeat the process for the URLs that you narrowed down in step 1. For a site that targets different countries in the same language, you’d use code like:
<link rel="alternate" hreflang="x-default" href="http://www.xyz.com/" />
<link rel="alternate" hreflang="en-gb" href="http://en-gb.xyz.com/page.html" />
<link rel="alternate" hreflang="en-us" href="http://en-us.xyz.com/page.html" />

Here hreflang="x-default" is used to declare a default page served to all other countries. This is generally the homepage or another neutral page suitable for all countries.
Step 3: Please note that the hreflang tags should be placed before the closing </head> tag, and in this scheme the tag for the page itself isn’t added. For example, the page

http://en-gb.xyz.com/page.html

should only contain the alternate versions, like

<link rel="alternate" hreflang="en-us" href="http://en-us.xyz.com/page.html" />

and for the page http://en-us.xyz.com/page.html, the tag should be

<link rel="alternate" hreflang="en-gb" href="http://en-gb.xyz.com/page.html" />

After implementation, you can check that everything works properly by logging into your Google Search Console (formerly Webmaster Tools) account. Proceed to “Search Traffic” and then “International Targeting.” If the hreflang tags were placed properly, you’ll be able to test them using the feature presented there. If problems ensue, try using a hreflang tag generator tool to make things easy.
Common Mistakes to Avoid

Incorrect use of language codes: All tags should contain codes as per ISO 639-1. Using incorrect ones will negatively impact your international SEO.
Missing confirmation link: If page A links to page B, page B must link back to page A with a proper hreflang tag.
Challenges with Canonical Tags

The purpose of the canonical tag, or rel=canonical, is simple. Suppose you have two pages with different URLs but exactly the same content. You would place a canonical tag on one page so that Google only indexes that version. Yet several problems can arise when using the canonical tag.
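As a quick illustration (the URLs are invented), a duplicate page points to its preferred version with a single tag in its <head>:

<!-- On http://www.example.com/product?sessionid=123 -->
<link rel="canonical" href="http://www.example.com/product" />

Google then consolidates indexing signals onto the canonical URL.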

From what you’ve read so far, there’s no doubt that the hreflang tag is used for geo-targeting, while canonical tags are used to solve duplication issues. However, with canonical, you must maintain a preferred version of each web page, which means spending a lot of time, effort, and likely money. Rel=alternate hreflang tags have an advantage: they can be combined with ccTLDs so users don’t get the notion that they need a .com, and ccTLDs are more apt to achieve better results.
What the Hreflang and Canonical Test Tells Us

State of Digital put hreflang and canonical tags to the test. Here’s what their study found:

Hreflang is suited for international SEO.
When you encounter problems with duplicate content, it’s okay to combine hreflang and canonical tags.
In the absence of problems with duplicate content, you shouldn’t combine hreflang and canonical tags.
As the person in charge of international SEO, you’re no doubt quite busy year-round. Hopefully the information here will help you steer clear of costly penalties and de-rankings from duplicate content. If you have additional international SEO tips or unanswered questions, please feel free to use the comment box below.

How to reduce your site’s bounce rates

Bounce rates tell you what percentage of people left a given page on your website without viewing any other pages. It’s not to be confused with exit rates, which simply tell you the percentage of visitors that left the site from a page (i.e. they may have viewed other pages first).

Also, it’s important to be aware that users could spend 10 minutes on your page before they leave the site.

In this scenario, it could well be that the page has fulfilled its purpose (or that the user has just forgotten to close it).

What do bounce rates tell you?

Bounce rate is generally used, along with other metrics, as a measure of a site’s ‘stickiness’.

For example, on SEW, I’d like people to click on a link from search, Twitter or some other referral source, find a useful article, then decide to browse further and view all of our other lovely content.

If bounce rates are high, it could mean that our content isn’t doing its job properly, though there are plenty of other possible explanations.

As a rule, I’d generally look at trends over time, and use bounce rates as one of several metrics for measuring the success of a particular page.

For example, this Google Analytics custom segment looking at the percentage of visitors viewing multiple pages provides a measure of a site’s ability to retain users’ interest beyond the page they land on.

[Image: Google Analytics custom segment showing page depth]

Other measures, such as average time on page or using event tracking to see how many people read to the bottom of your posts (as described here by Justin Cutroni) can also help.
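As a rough sketch of the event-tracking idea, assuming the standard Google Analytics analytics.js snippet is already on the page (the 'Articles' category and 'read-to-end' action names are invented for illustration):

<script>
  // Send a one-off Google Analytics event when the reader
  // scrolls to within 50px of the bottom of the page.
  var sent = false;
  window.addEventListener('scroll', function () {
    var atBottom = window.innerHeight + window.pageYOffset >= document.body.offsetHeight - 50;
    if (!sent && atBottom) {
      sent = true;
      ga('send', 'event', 'Articles', 'read-to-end', document.title);
    }
  });
</script>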

Still, the principle is important. If you’re the kind of site that wants people to stick around for a while, bounce rates provide a good general guide.

What exactly a good bounce rate is will depend on the type of site you’re running.

Working in online publishing, my experience is that bounce rates for individual articles can be as low as 40% and as high as 98%. The average varies between 70% and 85%; obviously I’m aiming for nearer 70%.

The Google Analytics screenshot below shows some of the more ‘evergreen’ articles we’ve published. As we can see, the bounce rates for such articles are lower than the average, which is nearer 80%.

[Image: Google Analytics bounce rates for evergreen SEW articles]

Is a high bounce rate always a bad thing?

In a word, no. It can depend very much on the purpose of the website.

For example, people may want to quickly find a contact number or check facts. If the site enables them to find this information easily, they’ll leave quickly, thus pushing up the bounce rates.

I may need to know how old Al Pacino is (as you do). I can Google his name, click on Wikipedia, and the information is instantly available on the right of the page. Then I hit the back button.

[Image: Google search result for Al Pacino with information shown on the right of the page]

Of course, I could linger longer, read more and click some of the links, but if that’s all I want to know, I’m playing a small part in increasing the site’s bounce rates.

For publishers like Search Engine Watch, we’d rather keep people on the site longer, so if someone clicks on the page, decides they’d rather not read the article in question and leaves, that may mean we haven’t delivered on their expectations.

In the latter case, high bounce rates are a bad thing.

The relationship between bounce rates and SEO

Of course, Google doesn’t know your bounce rates, though it theoretically can find this information from the millions of sites that use Google Analytics.

In theory it would be a useful ranking factor, as it is an indication of how relevant your landing page is to the user’s search query, though allowances would have to be made for the type of site and query.

If someone wants a guide to landing page design and bounces within seconds we can assume the page hasn’t delivered. However, if they just wanted to quickly check the weather for today, then maybe it has served its purpose.

The concept of dwell time, or the ‘long click’ (as explained here by Bill Slawski) is important. It’s similar to – but not the same as – bounce rates. It’s essentially a measure of how long a user spends on a page before returning to the search results page.

Whether this is a ranking factor or not is open to debate, but it certainly makes sense in the light of Google’s search for quality signals.

In essence, it works like this:

If a user clicks through from the SERPs onto a website and then spends some time there, it suggests that the result was relevant to the query and served its purpose. In this case, Google has done its job well in ranking said website highly.

If a user clicks through and then returns to the search results page quickly (or then selects another result) then it suggests the site has not been useful for the searcher. Therefore another site may usurp it in the rankings.

Of course, this is a simplified version, and there are variables. For example, what if the site answered the query immediately (as in my Al Pacino example)?

I would assume that Google would be able to find different metrics for different types of search query so that it could take account of this.

How to reduce bounce rates and keep visitors on your site for longer

The following factors should help to reduce bounce rates, and should also serve to keep users on site for a longer period, or at least remove the factors that will make them leave the site.

Here we are, in no particular order…

1. Make sure your pages load quickly

No one likes slow-loading pages, so make sure yours run as fast as they can, on mobile and desktop.

The old rule of thumb from Jakob Nielsen was that users would wait two seconds for a page to load before abandoning it. Whatever the exact time, if a site feels slow to load, people will be thinking about bouncing.

This is important from a user experience perspective, but also forms part of Google’s mobile ranking factors.

It looks like we have some work to do on that score…

[Image: PageSpeed results for Search Engine Watch]

2. Give visitors all the information they may need

This is a point which perhaps applies to ecommerce more than other sites.

Let’s take an example from the travel sector. If you’re researching hotels to stay in, then the obvious destination for many web users is TripAdvisor.

There they can find (in theory) impartial views on the hotel which cut through the sales pitch on the hotel or travel agent’s website. However, once on TripAdvisor, they may be swayed by other hotels.

If you have reviews on site though, or have integrated TripAdvisor reviews on the page, then one reason to head elsewhere is removed.

Here, Best Western hotels show TripAdvisor reviews (good and bad) on hotel pages:

[Image: TripAdvisor reviews embedded on a Best Western hotel page]

3. Avoid clickbait

Clickbait is commonplace now. In fact, it’s hard to find a news publisher’s site without this kind of garbage following articles.

[Image: promoted clickbait links beneath an article]

If you’re foolish enough to click one of these links you’ll find yourself on some of the worst sites on the web, full of pop-ups, pagination and endless attempts to trick you into clicking on ads. There’s also the question of why publishers would want to send their audiences there, but that’s an issue for another article.

Essentially, none of these posts are likely to deliver on the promise of the headline. The content needs to be relevant to the headline, or else people will bounce quickly.

I’m not against lists or using headlines to attract clicks, which is why this article has the headline it does. It’s just that headlines have to deliver on their promises.

4. Avoid huge pop-ups and annoying ads

Serving users with a huge pop-up as soon as they enter the site is a great way to make them hit the back button.

Likewise, intrusive rollover ads and autoplay audio are what make web users turn to adblockers. They will also make many visitors bounce as soon as they reach your site.

5. Use internal linking

I’ve written about the use of internal linking as an SEO tactic, but it performs an important role in keeping people on site for longer.

Providing users with links to other interesting articles relevant to the one they’re reading increases the likelihood that they’ll hang around for longer, and reduces those bounce rates.

6. Be careful with external links

I’m all for giving credit where it’s due when it comes to links, but if you add external links early in a post and don’t open them in a new window, you’re essentially inviting users to leave your site and inflating your bounce rates.
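For example, a minimal sketch of an external link that opens in a new tab (the URL is a placeholder; rel="noopener" is a sensible companion to target="_blank"):

<a href="https://www.example.com/original-study" target="_blank" rel="noopener">the original study</a>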

7. Do not use pagination

This could actually be posited as a way to reduce bounce rates, but I think the drawback of annoying users outweighs this particular aim.

People can scroll, so there’s no need to paginate. The only reason I can think of is to falsely inflate page views.

[Image: a paginated article]

8. Site design

Users will form an opinion of your site the minute they land on it, and much of this is down to the design.

Your search result or tweet may have convinced them to click, but bad design (or at least design that doesn’t appeal to the visitor) can convince them to leave.

For example, an ecommerce site should convey a certain level of professionalism if you’re asking users to trust their credit card details to you.

This site may sell the very best gates and fences, but the design doesn’t exactly convey professionalism. It’s also very hard to read.

[Image: a cluttered, hard-to-read gates and fences site]

(Hat tip to Branded3 for the example.)

9. Article formatting

This is very important. Just as people make quick judgements based on site design, they’ll also look at the article or page they clicked on and wonder how much work it will be to read and consume.

A wall of text with few paragraphs and no visual stimulus will deter many people just because it seems like hard work to digest.

[Image: an article presented as a wall of text]

On the other hand, clear sub-headings, bullet points, images and charts, and bold text on key stats and points make even longer articles seem more appealing.

Of course, the content should deliver, but first impressions matter in this respect.

10. Mobile-friendly pages

An obvious point. If you want mobile users to stay a while on your site, then make sure it’s mobile-optimized.
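At a minimum, a mobile-friendly page declares a viewport in its <head>, along these lines:

<meta name="viewport" content="width=device-width, initial-scale=1">

Without it, mobile browsers render the page at desktop width and force users to pinch and zoom.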

11. Site search and navigation

Site search provides an easy navigation option for visitors.

On ecommerce sites, site search users often convert at a higher-than-average rate, as using search can indicate a greater intent to purchase.

On other sites, search provides users with an alternative way to navigate through sites, one that some web users prefer.

Give people easy and clear ways to navigate around your site. Make navigation intuitive and consistent.

12. Related content recommendations

This is about giving people ideas for other content or pages based on the article they’re reading.

We use them here on SEW, based on the main topic of the article. You’ll see them down the page, between the author bio and the comments.

This may not be the best example of content recommendation in action, but the idea of providing content relevant to the current article is a good one.

[Image: related content recommendations on SEW]

13. Most read/commented boxes

This is another form of content recommendation, based on the articles being read or shared. Here’s an example from the BBC:

[Image: the BBC’s most read/most shared box]

We have something similar here, a trending posts box. You may or may not have noticed it…

The point of these is that they give users further ideas for reading, whether looking at the posts with the largest number of comments, or those with most views.

14. Make calls to action clear on landing pages

You have to make it clear where customers need to go next to buy a product, retrieve a quote, or whatever action you want them to take.

Here are some general pointers:

Wording. The wording you use should make it obvious what will happen if a user presses a button, such as ‘Add to cart’ or ‘Checkout.’
Colours. Test to see which colours work best. Contrast is key. Many sites tend to go for yellow or green, but what works for one site doesn’t necessarily work for another.
Size. Make them big enough to be seen easily, but not too big.
Placement. Buttons should be placed where users’ eyes are likely to be as they scan around the page.
Adapt for different devices. Calls to action should work across various mobile devices as well as desktop.
Test. There are no right or wrong answers here. Wording, colour, shape, placement etc can all be tested to find what produces the best results.

In summary

The tips here are a mixture of methods for persuading users to stay on your site longer, and to explore further.

The latter is key to reducing bounce rates, as users need to interact with your site, but the page they land on creates that all-important first impression.

If the first page doesn’t do its job in terms of delivering relevance to the user and avoiding obvious annoyances, then there’s little chance users will want to stick around.

Also, to repeat the earlier point: bounce rates are useful, but only when used alongside other metrics like time on page and the share of visitors viewing multiple pages.

Differentiation when selling a commodity

Imagine being tasked with building a sales force that would sell the identical products as the competition, but sell those at a premium price. That was the opportunity I accepted when taking on an executive sales leadership role years ago.

In the ’90s, as Microsoft, Novell and IBM/Lotus became software powerhouses, they recognized the need to train users on their products. Without that training, there was a high risk of users being dissatisfied with the products and not purchasing future upgrade releases of them. Rather than train users themselves, they developed training channels. Individual training companies contracted with these software companies and delivered training on their behalf.

However, the software companies didn’t blindly let their channels train users. They created the course curriculum, certified the instructors on the courses, and set the PC standards for the classroom. In essence, the software companies regulated the classroom experience and marketed to users that they could attend any of the thousands of “authorized” training facilities for a fantastic learning experience.

While the software companies wanted to create a “vanilla experience” for their clients, that was not the way their training channels approached the business.

Course prices were set all over the board. Salespeople told prospects that their training companies had better instructors and nicer PCs in the classrooms than their competitors, as a way to justify higher course prices. Again, the software companies wrote the curriculum, certified the instructors and provided the specifications for the PCs. How could it be better?

The software companies told clients that all of their authorized training companies were the same, like going to McDonald’s. Thus, prospects didn’t buy the better arguments. They bought low price. Why shouldn’t they? In the absence of differentiation, price is the ultimate decision factor. Yet, the task put in front of me was to build a sales force that could sell our courses at a premium price.

Next steps

While we were passionate that our training was superior, we could not prove it. The classroom was a commodity, but there was still an opportunity to differentiate ourselves and sell at higher prices.

We learned that IT managers had several challenges. First, when an IT manager needed to send an employee for a training course, he needed to get three to five signature approvals on the purchase order. It was the same repeated process for each employee for each course. It was a bureaucratic pain.

Second, there were a significant number of class attendees who, upon taking the course, realized that the course was not right for them or vice versa. This meant the training dollars were lost, as was the time the employee missed at work to take the course.

Finally, because of the demand for trained IT professionals, IT managers feared their trained employees would leave the company.

Given those three issues, we developed a sales strategy based on making it easier for IT managers to get the right training for their employees.

Key strategies

Our salespeople called on senior IT management executives rather than the mid-level managers whom most of our competitors called on. We introduced the idea of using a blanket purchase order for their training budget. Our training company billed against the blanket purchase order as employees enrolled in the courses. Interestingly, the internal process to get one blanket purchase order for a million dollars of training was the same as for a single fifteen-hundred-dollar course.

Before any employee was enrolled in a course, one of our education advisors interviewed the prospective student to make sure it was the training they needed and that they had the background to succeed in it.

Finally, we offered a replacement guarantee. If an employee we trained left his or her company within a year of completing a course, we trained the new person free of charge.

As a result of this differentiation strategy, we became the largest, most profitable training company for Microsoft, Novell and IBM/Lotus. We sold our courses at 30 to 50 percent higher prices than our competitors. Because we had the blanket purchase order for the entire training budget, our competitors were locked out of these accounts.

While our competitors argued “better,” we positioned “different.” Differentiate yourself by solving the problems your buyers face rather than trying to differentiate commodities.

Need help with differentiation? Download my free “Making Differentiators Matter” poster.

Lee Salz is a sales management strategist and best-selling author of “Hire Right, Higher Profits,” a top-rated sales and selling management book on Amazon. Salz specializes in helping companies build sales forces through effective hiring, onboarding, managing and compensating salespeople. He is the founder and CEO of Sales Architects, Business Expert Webinars, and The Revenue Accelerator. He is a speaker and a results-driven sales management consultant. Salz can be reached at 763-416-4321.

How to write meta tag titles

For SEO, the title tag is quite important. It is always a struggle to find an appropriate title tag that looks good to the search engine as well as to the customer who sees it in the search results and is prompted to click.

We all know that ideally your title should be 50-60 characters long, and that it should contain a keyword or a phrase containing a keyword. Some people say that the name of the company should come last and your main keyword should come first.

The way I write my titles is as follows:

Primary Keyword – Secondary Keyword | Company Name
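Following that format, a hypothetical title tag (brand and keywords invented for illustration) would look like this, coming in at around 55 characters:

<title>Leather Wallets – Men's Accessories | Smith Leather Co</title>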

However, I have seen a lot of titles that are longer than 60 characters and contain a bunch of keywords.

My question is, is it better to have titles of 50-60 characters or would it be more beneficial to have longer titles with more keywords?

If you had a longer title than 60 characters with a bunch of keywords stuffed in would it improve your rankings or help you rank for more phrases?

The only benefit of sticking to the 50-60 character standard that I see is that it is easier to read and looks nicer in search results.

Are Panda and Penguin really penalties from Google?

This article was written by Marie Haynes and published in Search Engine Watch; it is presented here with some minor changes.

Most people refer to Penguin and Panda as penalties from Google; however, Google is quite adamant that we should not be calling these algorithmic changes penalties.

In a Google Webmaster Help hangout, John Mueller stated that these algorithms are NOT penalties: “From our point of view, Penguin isn’t a penalty. It’s essentially a search quality algorithm… a penalty is something that is done manually.”

When asked a question about recovering from a ‘Penguin penalty’, John’s answer was: “We see Penguin as an algorithm. It’s not something we’d see as a penalty… It’s not something that’s either on or off. It’s something where we look at the signals that we have and we try to find the right way to adjust for that.”

What is a Google penalty?

If your site has a manual penalty, you will see evidence of this in your Google Search Console (formerly called ‘Webmaster Tools’). To see if you have a manual penalty, go to Search Traffic → Manual Actions. You’ll either see a penalty like this:

Or, if you have no penalty you will see this:

[Image: Search Console Manual Actions report showing no penalty]

Important Note: You will not always see evidence of a penalty in the ‘messages’ section of the Google Search Console (Webmaster Tools). If you were added to Google Search Console for this site before the site was penalized, then you should see a message that looks something like this:

[Image: an unnatural links message in Search Console]

However, if you became an owner or restricted owner after the penalty message was initially received, there will be no penalty message for you to see in the messages section. In this case, you will still be able to see the penalty in the Manual Actions Viewer though. Hopefully this is something that Google will change in the future. It would be quite helpful to be able to see the past site messages for a newly verified owner.

If you have a manual penalty, once you have cleaned up your site you can file for reconsideration. If you have done a thorough job, then a Google employee will manually remove your penalty. Something that changed in 2013 was that only sites that had a manual action could apply for reconsideration. Prior to this, anyone could file a reconsideration request, even if there was no manual penalty. For sites that were only affected algorithmically, you would get an automated response back telling you that there was no manual penalty. But now, you can’t file for reconsideration unless you actually have a manual penalty.

What is an Algorithmic Filter?

Google’s algorithms are immensely complex. There are parts of the algorithm that are constantly evaluating websites and modifying their rank depending on what they see. For example, Google’s keyword stuffing algorithm re-evaluates your site each time that Google crawls it. There are other parts of the algorithm, however, that we call filters. Filters are modifications that only take effect when Google decides to run them. Penguin and Panda are filters.

If either of these algorithms determines that your website is not a high quality site (or does not have high quality backlinks, in the case of Penguin), then Google will adjust the algorithm so that your site does not rank as well. If you have lots of issues, you can be affected severely. If you have just a few issues, you may see just a minor rank deduction.

I look at Penguin and Panda as if they were like sandbags that are holding a hot air balloon down. A site with serious issues can have very heavy sandbags that pull the site down and make it almost impossible for the site to rise in rankings unless those sandbags are removed. A site with minor issues might have lighter sandbags applied. These smaller weights still pull the site down somewhat, but not as severely.

Here are some things that separate manual penalties from algorithmic filters:

A manual penalty is manually applied by a member of Google’s webspam team. An algorithmic filter is an automatic thing.
You can’t file for reconsideration to get an algorithmic filter removed.
With a manual penalty, once you’ve cleaned up, and successfully requested reconsideration, the penalty is lifted. With an algorithmic filter, you need to improve your site and then wait for the algorithm (Penguin or Panda) to either update or refresh and reassess your site.
A manual penalty is either on or off. There can be cases where a severe penalty can be downgraded to a less severe one such as having a sitewide unnatural links penalty downgraded to a partial but in general a manual penalty is either there or it’s not. But, with an algorithmic filter, you can be affected to different degrees. Not all algorithmic hits are drastic.
There is no way of telling whether you are being demoted by an algorithmic filter. Google employees have a console where they can see whether a site is being affected by Panda or Penguin, but webmasters can’t see this. Oh how I wish Google would allow us to see whether we are dealing with an algorithmic filter! If you can see a drop in traffic that coincides with the date of a known or suspected refresh or update of Panda or Penguin, then this is a good hint that you are dealing with one of these issues. However, not all refreshes are announced. And, in the future, Google plans to incorporate both of these algorithms into the main algorithm, so it is going to be hard to determine what needs to be done in order to see recovery.
With an algorithmic filter, there is no way of knowing whether you’ve done enough work to escape the filter once it re-runs. If you clean up your backlinks, Penguin refreshes and you see a mild improvement, there’s no way of knowing whether you would have seen more improvement if you had removed or disavowed more links. You can’t tell whether you still have a mild case of Penguin or whether the site is completely free of algorithmic sandbags holding it down. Similarly, Google doesn’t tell you what type of on-site quality issues they want to see cleaned up for Panda. We take our best guess when doing a Panda cleanup, but if Google is taking issue with something that we haven’t addressed and is still suppressing the rankings for that site, there’s no way to know.
Should We Be Calling Panda and Penguin Penalties?

Do a Google search for “Panda penalty” or “Penguin penalty” and you’ll see some well known SEO professionals using this terminology. Is it wrong to do so? In my mind it’s all semantics. If you want to sound like someone who really understands Google’s algorithms, it’s probably best to refer to Panda and Penguin as algorithmic filters rather than penalties. But, when I’m talking to a small business owner who has had their revenue severely cut because they’re stuck under an algorithmic filter, I certainly don’t correct them when they say they are being penalized.

I feel that Google has done a good job at cleaning up the search results for the most part. When someone searches for information on car insurance, they’re not likely to see some scuzzy buy-cheap-car-insurance-online-now.biz site that got to the top of Google by manipulating the PageRank flowing to the site. As a user, I generally am getting better results now than I did a few years ago. It’s good for Google to show the most relevant results possible. But, these filters are causing so many businesses to suffer severely. Some made poor decisions in hiring a low quality SEO to build links to their site. Others don’t even know what they did wrong, but are the victim of site quality issues that perhaps are caused by a faulty CMS. In my opinion, there needs to be a better way for sites like this to be able to recover.

What Do You Think?

Have you been negatively affected by Panda or Penguin? Do you think we should be calling these penalties?

Selling To Customers Through ‘Shoppable Videos’

Everyone knows online video can be a great way to market businesses and products, but some businesses are finding that it can be quite beneficial for actually selling products. “Shoppable video” is a trend that has been slowly rising for several years, but new capabilities from a variety of platforms indicate that it could be poised to become much bigger.

Is video already a part of your marketing strategy? Is it part of your selling strategy? Tell us about your efforts in the comments.

“Retail video brings merchants’ products to life in a way that only e-commerce video can, often resulting in higher customer satisfaction and higher retail sales conversion,” says video marketing news blog ReelSEO.

YOUTUBE FOR SHOPPING

Greg Jarboe writes on the site that YouTube Shopping is the new window shopping and that “unlike the mall, YouTube never, ever sleeps.” He cites data directly from Google claiming that one third of all shopping searches happen between the hours of 10PM and 4AM.

A couple months ago, Google announced that it is extending its product listing ads (PLAs) to YouTube with TrueView for Shopping, its new format that lets businesses run product ads with related videos.

“Whether it’s watching a product review or learning how to bake a soufflé, we look to video in countless moments throughout the day to help us get things done,” Google said in a blog post. “We call these micro-moments – when we reflexively turn to our devices to learn more, make a decision, or purchase a product.”

It said it launched TrueView for shopping to “connect the dots between the moment a person watches a video and the moment they decide to make a purchase,” while also making it easy for viewers to get more info on the business’ products with the option to click to buy.

With these ads, businesses can showcase product details and images, and users can click and purchase from a brand or retail site from within the video ad. The option is available for TrueView in-stream video ads, and works across mobile, desktop, and tablet. 50% of views on YouTube come from mobile.

The ads are integrated with Google Merchant Center, so you can connect campaigns with a Merchant Center feed to dynamically add products and customize ads through contextual and audience signals such as geography and demographic information.

“Brands that have participated in our early tests of TrueView for shopping have seen strong results for driving interest and sales,” Google noted in the announcement. “Online home goods retailer Wayfair, for instance, saw a 3X revenue increase per impression served when compared to previous campaigns. And beauty retailer Sephora took advantage of this new ad format to drive +80% lift in consideration and +54% lift in ad recall, and an average view time of nearly two minutes.”


Most Important SEO Factors For a Good Website

If you have your own website, you naturally want more and more visitors, and you have to fight for a top rank in search engines like Google, Yahoo and MSN. The more visitors you get, the more your earnings from the website grow. There are certain crucial techniques that can place your website high in the search engine result pages. To help every website owner, here are some of the most important and influential SEO factors for building a successful presence in the search engine result pages. When you apply them to your website, you will see a dramatic improvement in your Google rankings.

USE KEYWORD ANYWHERE IN YOUR TITLE TAG: Keyword research is one of the crucial factors in search engine ranking, so start with good keyword research. Each page must have a unique title tag which describes what the page is exactly about. The title tag must be easy to read and well written to bring more traffic. Place the keyword in the title tag so readers can easily understand what the web page is about, but don’t stuff it with keywords. If your website serves local customers, include your locality in the title tag. Concentrate on a keyword that will help you rank well in Google, and try to place it at the beginning of the title tag.
USE KEYWORD-FOCUSED ANCHOR TEXT FROM EXTERNAL LINKS: Inbound links from other domains are crucial, and inbound links with your keyword as the anchor text are even better. They carry great weight in SEO.
EXTERNAL LINKS RELEVANCY: The quantity as well as the quality of external links matters greatly. Links work like votes, and more links mean more votes, which in turn means more popularity in Google’s rankings. That said, three relevant links from popular websites are much better than 40 links from unrelated, low-ranked sites.
UNIQUE AND FRESH CONTENT: Content is king for all websites. If your content is good enough, it will be read by more and more readers, which means more traffic, and more traffic signals that your website deserves to rank well in search engines. Duplicate content lowers your page’s rank, so add content on a regular basis. Search engines are like hungry children, always hungry for good, fresh content full of information. Add fresh, information-rich content with a viral aspect and plenty of high-quality incoming links, and you will have an authoritative page that ranks well.
USE KEYWORD IN DOMAIN NAME: Adding your keyword to your domain name can improve your SEO rank and ease your competition. New content on a domain that already has a strong presence on the Internet tends to be valued more by search engines like Google, Yahoo and MSN.

USE KEYWORD IN THE BODY TEXT: Use your keyword at roughly 2-5% density in your article; this repetition of the keyword(s) throughout the content tells Google what the page is about and how relevant it is to a particular search term, but do not overdo it. Bolding the keyword(s) can also reinforce the content’s focus.

Other important factors which help a website rank well are listed below; a combined markup example follows the list.
• Diversity of your link sources;
• Use of the keyword in the H1 header tag;
• Keyword in the anchor text of internal links;
• Use of the keyword in the page URL;
• Existence of the meta description tag, etc.
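As a hedged illustration of several of these placements at once (the domain, keywords and copy are all invented), the key markup for a page targeting “handmade oak furniture” might look like:

<!-- Page URL: https://www.oakcraftexample.com/handmade-oak-furniture/ -->
<head>
  <title>Handmade Oak Furniture – Custom Tables | Oakcraft</title>
  <meta name="description" content="Handmade oak furniture and custom tables, built to order in our workshop.">
</head>
<body>
  <h1>Handmade Oak Furniture</h1>
  <p>Read our <a href="/handmade-oak-furniture/care-guide/">handmade oak furniture care guide</a>.</p>
</body>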

SEO – what to do to keep up for the rest of 2015

Search engine optimization has never been more tiring than in 2015. It has been changing constantly: the shift from desktop to responsive mobile web design, the algorithm changes and more have already made webmasters do a lot more this year than in any previous year.

To keep up the pace for the remainder of the year, you must take an all-inclusive approach. Today’s SEO can be broken down into three key areas: 1. Technical aspects, 2. User experience, and 3. The content of the pages. Following is a rundown of each of these.

Technical SEO

Tweaking your code to make it more search engine friendly is very important, and performing strong keyword research is key for this step, because those keywords will be integrated into all aspects of your website, from meta descriptions to title tags. Technical SEO means making sure each and every page on your website has a clear description and gets read by Google correctly. Important elements include:

– Proper Keyword Usage: Keywords should be used (but only 1-2 times) in the following: URLs, meta descriptions, page titles, website text, ALT attributes, canonical tags and H1 tags.

– Website Errors: There should be no 404 or server errors on the website, nor should there be any duplicate content.

User Experience

User experience is key when it comes to SEO. Your website, which should be built by designers and content creators together, needs to be easy to navigate and use. What do you want your user to do? Contact you? Purchase something? This is sometimes called “inbound marketing”, but it’s important to keep UX in mind when designing (or restructuring) your website. Important elements include:

– Website Structure: Does the website structure make sense? Your site must also be optimized for mobile (or you may get docked by Google).

– Website Speed: Slow websites are bad from an SEO and UX perspective. Make sure all images are optimized and compressed and clean up any excess JS or CSS on the site.
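As a small, hedged example of the markup side of these fixes (file names are placeholders): deferring scripts keeps them from blocking rendering, and declaring image dimensions avoids layout reflow while compressed files keep transfer sizes down:

<script src="/js/site.js" defer></script>
<img src="/img/hero-compressed.jpg" width="800" height="400" alt="Product hero image">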

The Content

Content is now dubbed either “fat” or “thin”: Google distinguishes between types of content on a website and on social media. First, the content on your site must be “fat”, which means eBooks, white papers, and strong, shareable blog posts. You don’t want “thin” content, such as poor guest posts and outdated text. This goes for social media as well. Your social networks all contribute to your SEO, so choose which networks you want to focus on and make them great.

Another part of content is public relations. Some call this “the new link building”: you want strong content being written about your company, linking back to your site. These inbound links can come from press releases or stories about your company, but the stronger the source (aka where the post comes from), the stronger your website optimization will be.

SEO tactics change frequently, but these tactics will get you through the rest of this year. If you haven’t thought about some of these elements, then make sure you do before we see another Google algorithm update.