Get More Customers To My Website

Monthly Archives: June 2015

Big Data, Big Problems: 4 Major Link Indexes Compared

Posted by russangular

Given this blog’s readership, chances are good you will spend some time this week looking at backlinks in one of the growing number of link data tools. We know backlinks continue to be one of the most important parts of Google’s ranking algorithm, if not the most important. We tend to take these link data sets at face value, though, in part because they are all we have. But when your rankings are on the line, is there a better way to tell which data set is the best? How should we go about assessing different link indexes like Moz, Majestic, Ahrefs and SEMrush for quality? Historically, there have been four common approaches to this question of index quality…

  • Breadth: We might choose to look at the number of linking root domains any given service reports. We know that referring domains correlate strongly with search rankings, so it makes sense to judge a link index by how many unique domains it has discovered and indexed.
  • Depth: We also might choose to look at how deep the web has been crawled, looking more at the total number of URLs in the index rather than the diversity of referring domains.
  • Link Overlap: A more sophisticated approach might count the number of links an index has in common with Google Webmaster Tools.
  • Freshness: Finally, we might choose to look at the freshness of the index. What percentage of links in the index are still live?

There are a number of really good studies (some newer than others) using these techniques that are worth checking out when you get a chance:

  • BuiltVisible analysis of Moz, Majestic, GWT, Ahrefs and Search Metrics
  • SEOBook comparison of Moz, Majestic, Ahrefs, and Ayima
  • Matthew Woodward study of Ahrefs, Majestic, Moz, Raven and SEO Spyglass
  • Marketing Signals analysis of Moz, Majestic, Ahrefs, and GWT
  • RankAbove comparison of Moz, Majestic, Ahrefs and Link Research Tools
  • StoneTemple study of Moz and Majestic

While these are all excellent studies of the methodologies above, they share a particular limitation: they all miss one of the most important metrics we need to determine the value of a link index, proportional representation to Google’s link graph. So here at Angular Marketing, we decided to take a closer look.

Proportional representation to Google Search Console data

So, why is it important to determine proportional representation? Many of the most important and valued metrics we use are built on proportional
models. PageRank, MozRank, CitationFlow and Ahrefs Rank are proportional in nature. The score of any one URL in the data set is relative to the
other URLs in the data set. If the data set is biased, the results are biased.
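
To make the “relative” point concrete, here is a minimal sketch (not from the original study; the URLs and link counts are invented) of how a URL’s share-of-total score changes when the same calculation runs over a biased sample of the link data:

```python
# Minimal sketch: a "proportional" metric is each URL's share of the total,
# so the same URL scores differently depending on which URLs the index sampled.
# The link counts below are made-up illustrative numbers.

def proportional_scores(link_counts):
    """Score each URL relative to the rest of the data set (shares sum to 1)."""
    total = sum(link_counts.values())
    return {url: count / total for url, count in link_counts.items()}

full_graph = {"a.com": 500, "b.com": 300, "c.com": 150, "d.com": 50}
biased_sample = {"a.com": 500, "b.com": 300}  # the index never discovered c.com or d.com

print(proportional_scores(full_graph))     # a.com scores 0.50
print(proportional_scores(biased_sample))  # a.com now scores 0.625: same URL, biased data set
```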

A Visualization

Link graphs are biased by their crawl prioritization. Because there is no full representation of the Internet, every link graph, even Google’s,
is a biased sample of the web. Imagine for a second that the picture below is of the web. Each dot represents a page on the Internet,
and the dots surrounded by green represent a fictitious index by Google of certain sections of the web.

Of course, Google isn’t the only organization that crawls the web. Other organizations like Moz, Majestic, Ahrefs, and SEMrush have their own crawl prioritizations, which result in different link indexes.

In the example above, you can see different link providers trying to index the web like Google. Link data provider 1 (purple) does a good job of building a model that is similar to Google’s. It isn’t very big, but it is proportional. Link data provider 2 (blue) has a much larger index, and likely has more links in common with Google than link data provider 1, but it is highly disproportional. So, how would we go about measuring this proportionality? And which data set is the most proportional to Google?

Methodology

The first step is to determine a measurement of relativity for analysis. Google doesn’t give us very much information about their link graph. All we have is what is in Google Search Console. The best source we can use is referring domain counts. In particular, we want to look at what we call referring domain link pairs. A referring domain link pair would be something like ask.com->mlb.com: 9,444, which means that ask.com links to mlb.com 9,444 times.

Steps

  1. Determine the referring domain link pairs and values for 100+ sites in Google Search Console
  2. Determine the same for Ahrefs, Moz, Majestic Fresh, Majestic Historic, and SEMrush
  3. Compare the referring domain link pairs of each data set to Google, assuming a Poisson distribution (a rough sketch of this comparison follows the list below)
  4. Run simulations of each data set’s performance against each other (e.g., Moz vs. Majestic, Ahrefs vs. SEMrush, Moz vs. SEMrush, et al.)
  5. Analyze the results
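
The original write-up does not include code, but one plausible reading of step 3 is sketched below: rescale an index to Google’s total link volume, then score how well its referring domain link pair counts explain Google’s observed counts under a Poisson model. The domain pairs and counts here are made up for illustration.

```python
import math

def poisson_logpmf(k, lam):
    """log P(K = k) for a Poisson(lam) variable."""
    lam = max(lam, 1e-9)  # avoid log(0) when the index is missing a pair entirely
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def proportional_similarity(google_pairs, index_pairs):
    """Rescale the index to Google's total link volume, then sum Poisson
    log-likelihoods of Google's pair counts given the rescaled expectations.
    Higher (less negative) means more proportional to Google's link graph."""
    google_total = sum(google_pairs.values())
    index_total = sum(index_pairs.values()) or 1
    scale = google_total / index_total
    return sum(
        poisson_logpmf(google_count, index_pairs.get(pair, 0) * scale)
        for pair, google_count in google_pairs.items()
    )

# Made-up referring domain link pairs, e.g. ("ask.com", "mlb.com"): 9444
google  = {("ask.com", "mlb.com"): 9444, ("espn.com", "mlb.com"): 1200}
index_a = {("ask.com", "mlb.com"): 4800, ("espn.com", "mlb.com"): 610}    # small but proportional
index_b = {("ask.com", "mlb.com"): 9444, ("espn.com", "mlb.com"): 12000}  # bigger but skewed

print(proportional_similarity(google, index_a) > proportional_similarity(google, index_b))  # True
```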

Results

When placed head-to-head, there seem to be some clear winners at first glance. Moz edges out Ahrefs in direct comparison, but across the board the two fare quite evenly, and Moz, Ahrefs and SEMrush all seem to do far better than Majestic Fresh and Majestic Historic. Is that really the case? And why?

It turns out there is an inversely proportional relationship between index size and proportional relevancy. This might seem counterintuitive: shouldn’t the bigger indexes be closer to Google? Not exactly.

What does this mean?

Each organization has to create a crawl prioritization strategy. When you discover millions of links, you have to prioritize which ones you might crawl next. Google has a crawl prioritization strategy, and so do Moz, Majestic, Ahrefs and SEMrush. There are lots of different things you might choose to prioritize…

  • You might prioritize link discovery. If you want to build a very large index, you could prioritize crawling pages on sites that
    have historically provided new links.
  • You might prioritize content uniqueness. If you want to build a search engine, you might prioritize finding pages that are unlike
    any you have seen before. You could choose to crawl domains that historically provide unique data and little duplicate content.
  • You might prioritize content freshness. If you want to keep your search engine recent, you might prioritize crawling pages that
    change frequently.
  • You might prioritize content value, crawling the most important URLs first based on the number of inbound links to each page (a toy sketch blending these kinds of signals follows below).
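
As a purely illustrative toy (not any provider’s real system), a blended crawl priority might look like the following sketch; the signal names, weights, and URLs are all invented:

```python
import heapq

# Toy crawl frontier that blends several prioritization signals into one score.
# The signals and weights are illustrative only, not any crawler's real formula.
WEIGHTS = {"new_link_yield": 0.4, "uniqueness": 0.2, "change_rate": 0.2, "inbound_links": 0.2}

def priority(signals):
    """Higher blended score = crawl sooner. Signal values are assumed to be in [0, 1]."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

def build_frontier(candidates):
    """candidates maps URL -> signal dict. heapq is a min-heap, so scores are negated."""
    heap = [(-priority(signals), url) for url, signals in candidates.items()]
    heapq.heapify(heap)
    return heap

frontier = build_frontier({
    "http://example.com/hub":  {"new_link_yield": 0.9, "inbound_links": 0.8},
    "http://example.com/blog": {"change_rate": 0.9, "uniqueness": 0.6},
    "http://example.com/old":  {"inbound_links": 0.2},
})

while frontier:
    neg_score, url = heapq.heappop(frontier)
    print(f"{url}  (priority {-neg_score:.2f})")  # printed in crawl-priority order
```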

Chances are, an organization’s crawl priority will blend some of these features, but it’s difficult to design one exactly like Google. Imagine
for a moment that instead of crawling the web, you want to climb a tree. You have to come up with a tree climbing strategy.

  • You decide to climb the longest branch you see at each intersection.
  • One friend of yours decides to climb the first new branch he reaches, regardless of how long it is.
  • Your other friend decides to climb the first new branch she reaches only if she sees another branch coming off of it.

Despite having different climbing strategies, everyone chooses the same first branch, and everyone chooses the same second branch. There are only
so many different options early on.

But as the climbers go further and further along, their choices eventually produce differing results. This is exactly the same for web crawlers
like Google, Moz, Majestic, Ahrefs and SEMrush. The bigger the crawl, the more the crawl prioritization will cause disparities. This is not a
deficiency; this is just the nature of the beast. However, we aren’t completely lost. Once we know how index size is related to disparity, we
can make some inferences about how similar a crawl priority may be to Google.

Unfortunately, we have to be careful in our conclusions. We only have a few data points to work with, so it is very difficult to be certain about this part of the analysis. In particular, it seems strange that Majestic would get better relative to its index size as it grows, unless Google holds on to old data (which might be an important discovery in and of itself). Most likely, we simply can’t draw conclusions at that level of detail yet.

So what do we do?

Let’s say you have a list of domains or URLs for which you would like to know their relative values. Your process might look something like this (a rough sketch in code follows the list)…

  • Check Open Site Explorer to see if all the URLs are in their index. If so, you are looking at the metrics most likely to be proportional to Google’s link graph.
  • If any of the links do not occur in the index, move to Ahrefs and use their Ahrefs Rank if all you need is a single PageRank-like metric.
  • If any of the links are missing from Ahrefs’s index, or you need something related to trust, move on to Majestic Fresh.
  • Finally, use Majestic Historic for (by leaps and bounds) the largest coverage available.
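
A rough sketch of that fallback order is below. The INDEX_CONTENTS dictionary and URLs are made-up stand-ins for whatever lookups you actually run against each provider (API, CSV export, and so on):

```python
# Hypothetical illustration of the fallback order described above.
PROVIDER_ORDER = ["Open Site Explorer", "Ahrefs", "Majestic Fresh", "Majestic Historic"]

# Stand-in for real provider lookups; the URLs are invented.
INDEX_CONTENTS = {
    "Open Site Explorer": {"http://a.com/", "http://b.com/"},
    "Ahrefs":             {"http://a.com/", "http://b.com/", "http://c.com/"},
    "Majestic Fresh":     {"http://a.com/", "http://b.com/", "http://c.com/", "http://d.com/"},
    "Majestic Historic":  {"http://a.com/", "http://b.com/", "http://c.com/", "http://d.com/", "http://e.com/"},
}

def pick_index(urls):
    """Walk from most proportional to broadest coverage and return the first index
    that contains every URL; Majestic Historic is the final fallback."""
    for provider in PROVIDER_ORDER:
        if set(urls) <= INDEX_CONTENTS[provider]:
            return provider
    return "Majestic Historic"

print(pick_index(["http://a.com/", "http://b.com/"]))                   # Open Site Explorer
print(pick_index(["http://a.com/", "http://c.com/", "http://d.com/"]))  # Majestic Fresh
```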

It is important to point out that the likelihood that all the URLs you want to check are in a single index increases as the accuracy of the metric decreases. Considering the size of Majestic’s data, you can’t ignore them, because you are less likely to get null-value answers from their data than from the others. If anything rings true, it is that once again it makes sense to get data from as many sources as possible. You won’t get the most proportional data without Moz, the broadest data without Majestic, or everything in between without Ahrefs.

What about SEMrush? They are making progress, but they don’t publish any relative statistics that would be useful in this particular
case. Maybe we can hope to see more from them soon given their already promising index!

Recommendations for the link graphing industry

All we hear about these days is big data; we almost never hear about good data. I know that the teams at Moz, Majestic, Ahrefs, SEMrush and others are interested in mimicking Google, but I would love to see some organization stand up against the allure of more data in favor of better data: data more like Google’s. It could begin with testing various crawl strategies to see if they produce a result more similar to the data shared in Google Search Console. Having the most Google-like data is certainly a crown worth winning.

Credits

Thanks to Diana Carter at Angular for assistance with data acquisition and Andrew Cron with statistical analysis. Thanks also to the representatives from Moz, Majestic, Ahrefs, and SEMrush for answering questions about their indices.


SearchCap: Bing Powers AOL, Uber To Bing Maps & Google+ Gone From Google Search

By Barry Schwartz

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Industry

Local & Maps

Link Building

Searching

SEO

SEM / Paid Search

The post SearchCap: Bing Powers AOL, Uber To Bing Maps & Google+ Gone From Google Search appeared first on Search Engine Land.


SPONSOR MESSAGE: Attribution Modeling for Data-Driven Marketers

By Search Engine Land

Although marketing teams use multiple channels to drive sales, the majority of marketers still employ single-touch attribution models. These increasingly outdated models aren’t up to the task of tracking which campaigns have the biggest impact on performance and which channels or tactics deserve credit for each conversion. The findings in this report from AdRoll should provide guidance on how to measure your campaign performance in the long and short term. Get your copy now.

The post SPONSOR MESSAGE: Attribution Modeling for Data-Driven Marketers appeared first on Search Engine Land.


Google+ Brand Posts Have Been Stripped From Knowledge Graph Cards

By Greg Finn

This week Google’s Knowledge Graph cards became a little less social. The Knowledge Cards had previously displayed recent Google+ posts of many brands, something that was discontinued last week.

According to a Google spokesperson, the Google+ posts were removed from Knowledge Graph cards in search results in order to provide more consistency. Google+ supporters shouldn’t be disheartened, though: Google+ posts will still appear within the search results page, just not in the Knowledge Cards. This places Google+ posts alongside tweets and other publicly crawlable links, in the same fashion as any other social network. It was confirmed that this change officially went live last week.

[Screenshot: Salesforce Knowledge Graph card, Google+ posts previously showing]

[Screenshot: Salesforce Knowledge Graph card, Google+ posts no longer showing]

The change of integrating Google+ similarly to other networks has been in place for a while now. In January, Google began tying other social profiles into the Knowledge Graph cards, and earlier this year Google decreased promotion of the service.

The Google+ posts in the Knowledge Graph were still a boon for many marketers who were active on Google+. Ad extensions and Google+ follower information are still displaying for AdWords ads; this change applies to the Knowledge Graph only.

Hat tip to SEMPost.

The post Google+ Brand Posts Have Been Stripped From Knowledge Graph Cards appeared first on Search Engine Land.


SEO Can’t Always Get What It Wants — Or Can It?

By Erin Everhart

Everyone can, and probably has, argued over what part of working in SEO is the hardest. From the frequent algorithm updates and never really knowing what Google is thinking to constantly explaining yourself to executives and fighting tooth and nail to correct our bad reputation, we have plenty of options to choose from.

Personally, my nomination for one of the most difficult challenges is managing the push and pull within organizations to ensure SEO gets the resources it needs to achieve results.

As an SEO, you don’t really “own” any one digital asset, but everything in digital has an impact on your organic search traffic. It’s a disturbing situation because when something changes — even if you have nothing to do with it and perhaps don’t even know about it — you’re still on the hook when your organic traffic tanks.

So, how do you work with other teams to get what you want?

First, a few generalities that apply to anyone you’re working with:

  1. Speak their language. Throwing out acronyms and industry jargon is going to leave your listeners confused.
  2. Compromise. Don’t come in guns blazing demanding it’s your way or no way. Being a good business partner requires a bit of give and take so people actually want to work with you again.
  3. Talk on their terms. The easiest way to get what you want is to show how it’s actually going to benefit the other person. Focus on how whatever change will impact their KPIs, not just organic traffic.

Now that we’ve got the basics covered, how can SEOs work with specific departments so everyone gets what they want?

C-Suite

Company adoption of SEO ultimately comes from the top down; if your C-suite is on board and understands the value, you’ll have an easier time working in the weeds with the people who actually make the changes.

To do that, prove the value of SEO without drowning them in data. We have tons of metrics at our beck and call, but, in most cases, the only numbers C-suites really care about are traffic and revenue. Focus on overall business impact, not just how what you’re proposing affects organic traffic and revenue.

You’ll make even more impact if you can show competitors gaining more market share, because no C-suite wants to be second place to direct competition.

UX/Design Department

Designers don’t want to compromise their design for SEO, and they don’t want to design something solely for search engines. And rightfully so. Your site should first and foremost serve the needs of your users, but too often we’re forgetting that search engines are primary users of your site — probably the biggest ones in terms of how many times they access your site.

Each time you speak with your design team, watch your wording: Avoid things like “designing for SEO” or “building it for bots” because you’re only perpetuating the stereotype that SEOs don’t care about users.

The likeliest chance of conflict comes over content. Everyone knows you need live text on your web pages if you expect them to rank, but the design argument is that users don’t read content, and all it does is push down the important stuff (images, products, CTAs) that’s aimed at spurring people to make a purchasing decision.

The most popular compromise is the eyesore of a content block at the bottom of a page.

Sure, great for SEO, but this is useless for everyone else.

Yeah, I’m sure everyone is reading this 11px sized font.

Sure, that content block is great for SEO, but it’s a sub-par user experience, and it’s definitely not the only way to rank for competitive terms. There are plenty of companies doing good design that have great SEO without that damned content block:

  • Otterbox ranks for “iPhone cases”
  • Target ranks for “bathing suits”
  • Best Buy ranks for “digital cameras”

The point is there are plenty of ways to have a well-designed, engaging site that kicks butt in search engines, using things like web fonts, expandable divs, mouse-overs on images to show content, and small chunks of content scattered throughout the page rather than one large block at the end.

There’s no silver bullet solution, and what works for one site may not work for yours. Thankfully, designers love testing even more than SEOs, so approach your suggested changes not like, “This is what we have to do,” but more like, “Hey, I think this could help; let’s see how our users and search engines react.”

Present a couple of design options, put them out in the wild for six to eight weeks, and see what improves your positioning most while also increasing your overall engagement.

Copywriters

Whether you believe all SEOs should know how to write by themselves, or you rely on external copywriters, we all know SEO can’t exist without content. (Remember, content doesn’t have to be blog posts or marketing copy. Title tags and meta descriptions, two things which SEOs historically “own,” are pretty important content pieces for SEO, too.)

Copywriters are pretty much the sorcerers of today’s digital landscape as most everything that exists online includes some form of written content. It’s a primary driver for search engine rankings, and it’s the number one way users interact with brands, whether that involves content in emails, social posts, articles or product descriptions.

Copywriters are also always looking for things to write about, and that’s exactly where SEO steps in. SEOs have a pulse on what users are searching for and should be steering the content topics. That lifts some of the burden off the copywriters in coming up with the ideas, while also providing new organic entry points across your website.

Development & IT Teams

There are obvious elements that make for good SEO (like design and content), but there are even more nuances when you pull back the curtain and look at a site’s foundation. If your site isn’t built correctly, no amount of good design and quality content will bring you organic search visibility. Your developers are your lifelines, and you need to make sure you’re their favorite SEO.

This is the one team where speaking their language makes the most impact. Whether you’re working with network support or programmers, you’re interacting with highly specialized and highly technical people. If you’re not familiar with how websites are built and don’t understand the relevant jargon, you’ll get lost in their conversation.

You don’t have to know how to do the work yourself, but you’d better know how to clearly explain it. This makes a huge impact when you’re requesting work or putting in a JIRA story. Are all the requirements there? Did you note specifically where on the site you need the change to go? Will they be able to pick up the story and successfully complete the task without having to track you down for more information?

Even with the SEO and digital landscape changing every day, I don’t think there will ever be a time when we — not just SEOs, but anyone working in digital — can do our jobs in a silo. We’ll always have to rely on other teams to meet our KPIs. So, what’s worked for you? How have you been able to interact with other team members to do what’s best for the business and what’s best for SEO?

The post SEO Can’t Always Get What It Wants — Or Can It? appeared first on Search Engine Land.


Is Local Directory Traffic On The Rise Again?

By Myles Anderson

I keep a close eye on the health of the local online directory market. It’s an important part of the local data ecosystem, and it’s central to the citation work done by us (at my company, BrightLocal) and by many thousands of local search marketers.

I’ve been monitoring the traffic fortunes of a number of directory sites since 2011, and the charts and analysis below are a follow-up to similar data I shared last year.

The data is taken from Quantcast, and the figure used is their U.S. “People per Month” data, which they describe as “[t]he estimated number of people accessing a property from online and mobile web in aggregate.”

Note: The figures provided by Quantcast are estimates and will differ from other tracking sources. The data is best used to view trends and to compare volumes and trends between sites. The data is for the U.S. — U.S. directories and U.S. users.

We studied 30 of the most prominent, well known, and high traffic U.S. directories including Yelp, Whitepages, YP, MapQuest and 26 others.

The overall picture remains one of long-term decline for the local online directory market as a whole, with the exceptions of a couple of directories bucking the downward trend. However, in the last two months, the aggregate traffic to directories appears to have bounced and is climbing again.

Most Sites In Decline But Yelp Remains Constant

[Chart: All sites vs. Yelp, visits 2013–2015]

Over the past 28 months, there has been a 35% decline in traffic to the top online directories. We’ve separated Yelp from the rest of the group because its traffic is so vast in comparison that combining it with the others skews the view of what’s happening in the industry at large.

In fact, we can see that Yelp’s visit numbers (approx. 80 million/month) are almost the same as the other 29 sites combined! This really shows the impact of Yelp’s investment in its service.

Yelp has grown and nurtured a loyal review-writing audience, which has allowed it to build the most comprehensive set of online reviews for local businesses. In turn, it has secured significant distribution deals with Yahoo and Apple Maps. This, along with establishing itself as a household name brand, has meant its user numbers have remained high and growing while those around it flounder.

Are Bigger Brand Directories Doing Any Better?

We know that Google likes brands and uses brand as a way of distinguishing reputable sites and businesses from others with similar content but less authority. Yelp certainly benefits from this.

Among the 30 directories we examined, there are some other well-known(ish) brands; we’ve dubbed these the “Big 12.” So, are they faring any better than the smaller, lesser known brands?

The following chart looks at % change in visits from May 2014 to May 2015.

[Chart: Percentage change in visits, May 2014 to May 2015]

So, it appears the answer is, “No, not really!”

In fact, the bigger directories have lost a greater percentage of their traffic (18%) than the smaller directories (13%). Over the same period, Yelp saw a 6% uplift in traffic.
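
For reference, these year-over-year figures are just the standard percentage-change calculation applied to Quantcast’s monthly “People per Month” estimates; a minimal sketch with invented numbers:

```python
def pct_change(old, new):
    """Percentage change from an earlier estimate to a later one."""
    return (new - old) / old * 100

# Made-up monthly "People per Month" estimates for illustration only.
may_2014, may_2015 = 2_400_000, 1_970_000
print(f"{pct_change(may_2014, may_2015):+.1f}%")  # roughly -17.9%, i.e. a decline
```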

So, Is The Future Bleak For Local Directories?

If we focus in on the first five months of 2015 (Jan-May), the story actually looks a little more rosy for the beleaguered directories. The chart below shows a fall from Jan-March, which is reversed in April and May.

[Chart: All sites vs. Yelp, January–May 2015]

If we split out the Big 12 directories from the smaller sites, then we can see that the Big 12 have benefited the most from this revival of fortunes, while the smaller sites have not.

[Chart: Percentage change in visits, January–May 2015]

Who Are The Big Winners & Losers?

If we look more closely at the Big 12 sites, we can see that over the last 12 months, there have been some clear winners and losers.

[Chart: Big 12 directories, percentage change in visits over the last 12 months]

The Better Business Bureau website (BBB.org) has seen significant growth in visits, up 70% from May 2014 to 10m visits/month.

Yelp and Whitepages.com have also seen some growth, but only fractionally.

Why Is BBB.org Bucking The Trend?

Technically, BBB isn’t a directory like the others. It is a group of local organizations that champion trust in local businesses and offer accreditation for businesses, which helps consumers distinguish trustworthy, decent businesses from low-quality ones best avoided.

But the site does provide listings of businesses, publishes consumer reviews about businesses, and earns revenue from businesses that get accredited. It’s also a site that many search marketers like to build citations on, so we’ve included it in the data.

I believe there are two key things that set BBB.org apart from normal directories:

  • Unique Review Content. The site is building a decent range of reviews, which provides unique content for Google to hook into.
  • High Trust Factor. Google likes sites it can trust, and BBB is built to help consumers make better choices by disclosing the truth about a business. But it’s not just about Google. As more consumers become aware of BBB.org, they will start to rely on it more when purchasing from local businesses.

[Chart: BBB.org growth in traffic]

What Does The Next 12 Months Hold?

It’s hard to predict anything except more decline; certainly, that appears to be the outlook for smaller directories. Even as I type, Google’s May “Doorway” algorithm update may be having an effect on these sites. The exact impact is still unclear, and I point you in the direction of Andrew Shotland’s blog to read more about it.

This decline in fortunes will likely lead to closures or mergers of some sites as their business models no longer deliver enough revenue to make them worthwhile ventures for their owners.

For larger names in the industry, the game is shifting. They’re now becoming marketing service businesses building out a wide array of services to sell to local businesses — everything from website building to PPC to reputation management. Their directories offer them a source of leads to sell these services to — so they’ll maintain them, but they won’t be the core of the businesses anymore.

The post Is Local Directory Traffic On The Rise Again? appeared first on Search Engine Land.
