
Monthly Archives: August 2015


Mount McKinley Becomes Mt. Denali On Google Maps; Bing Stays With Old Name


By Danny Sullivan

[Photo: Denali in 1996, by Danny Sullivan]

North America’s highest mountain has been restored to its native name of Denali by US President Barack Obama. The peak had been known as Mount McKinley since 1917.

The move has sparked some political debate, especially among Ohioan politicians who view it as a slight against Ohio native William McKinley, who was the 25th president of the United States. Alaskan politicians had been pushing for the change.

I was curious how quickly the major search engines would change the name on their mapping services. As it turns out, Google has already switched over:

[Screenshot: Denali on Google Maps]

On Google Maps, the peak is listed as “Mt Denali.” A search for the official name (as I understand it) of “Denali” won’t find it, but “Mount Denali,” “Mt Denali” and even “Mount McKinley” will.

On Bing Maps, it’s still the old name that appears:

[Screenshot: Mount McKinley on Bing Maps]

A search for “Mt Denali” will find the mountain but shows the Mount McKinley name. “Mount McKinley” also finds it. “Denali” brings up the town of Denali.

Both Google and Bing also provide direct answer information for places along the right side of their search results pages. The name change has yet to come to these areas. For a search on “Mount McKinley,” both still list the peak with that name rather than Denali:

[Screenshots: “Mount McKinley” direct answers on Google and Bing]

Searching for “Denali” brings up information about the mountain but still with the Mount McKinley name:

[Screenshots: “Denali” direct answers on Google and Bing]

These direct answers for both search engines draw heavily from Wikipedia. Its page about the mountain has been changed to reflect the restored name of Denali.

The post Mount McKinley Becomes Mt. Denali On Google Maps; Bing Stays With Old Name appeared first on Search Engine Land.


Google Accused of Rigging Search Results by India’s Competition Commission by @mattsouthern


By Matt Southern

Google is again being accused of favoring its own properties in search results, this time by the Competition Commission of India. According to a report in The Economic Times, the Commission is accusing Google of ranking its own websites ahead of more deserving competitors. India’s Competition Commission also takes issue with the fact that Google’s paid listings appear ahead of organic listings. However, that complaint doesn’t hold as much ground, since it’s more a complaint against the principles of advertising. In addition to the Competition Commission of India, companies like Microsoft, Facebook, and Nokia’s maps division have also filed […]

The post Google Accused of Rigging Search Results by India’s Competition Commission by @mattsouthern appeared first on Search Engine Journal.


SearchCap: Bing Predictions Tackles The NFL, Yahoo Expands Gemini & Blocking Bad Bots


By Amy Gesenhues

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Link Building

Searching

SEM / Paid Search

SEO

The post SearchCap: Bing Predictions Tackles The NFL, Yahoo Expands Gemini & Blocking Bad Bots appeared first on Search Engine Land.



Bing Predictions Tackles The NFL, Likes The Broncos, Colts, Packers & Seahawks


By Matt McGee

NFL fans in Denver, Indianapolis, Green Bay and Seattle might be happy to know that Bing’s prediction engine has slated their hometown teams as the top seeds in the AFC and NFC this coming season.

Those are some of the initial prognostications as Bing sets its prediction engine loose on pro football.

The playoff predictions are part of a weekly power ranking that Bing plans to publish each Tuesday at 12:00 PM PT. A search for “nfl playoff predictions” will bring up the most up-to-date predictions for the six teams in each conference that Bing thinks are on track to make the playoffs.

Those aren’t the only predictions Bing has planned for football season. There’ll be weekly game-by-game predictions that will show on searches for NFL team names, along with an explanation of why Bing is making each game prediction. Bing is also going to try helping fantasy football players by predicting who the top performers will be each week. That feature can be seen now by searching “fantasy football predictions.”

[Screenshot: Bing fantasy football predictions]

Bing’s prediction technology uses a variety of data to make intelligent guesses about upcoming events. Beyond the sports world, it’s been used for things as serious as politics (where Bing correctly predicted 97 percent of the 2014 U.S. Senate races) and as frivolous as entertainment awards (Bing got 84 percent of its Academy Awards predictions correct) and reality TV shows (where it correctly predicted 90 percent of American Idol results).

By the way, Bing’s other predicted NFL playoff teams this season? Baltimore, New England, Houston and Miami in the AFC; Dallas, Atlanta, Philadelphia and New Orleans in the NFC. But that’s just today; as any NFL fan knows, things can change quickly — and Bing is betting that its predictions can keep up every week as the season goes along.

The post Bing Predictions Tackles The NFL, Likes The Broncos, Colts, Packers & Seahawks appeared first on Search Engine Land.


Driving Growth With Marketing Automation – September 9 Webcast


By Search Engine Land

Advancements in marketing automation have created an opportunity for agencies and marketing teams to be a lifeline for their clients. Marketing teams’ roles have evolved to encompass a multitude of responsibilities, including customer and prospect engagement, relationship management, data analysis and technical expertise. To manage these responsibilities, agencies and marketing teams need to take advantage of today’s marketing automation technology.

Join panelists Brandee Johnson and Paige Musto for this Digital Marketing Depot webcast and learn how marketing automation helps organizations deliver the right message to their target audience at the right time.

Registration is free at Digital Marketing Depot.

The post Driving Growth With Marketing Automation – September 9 Webcast appeared first on Search Engine Land.



Yahoo Follows Google In Building Out Local Search Marketing Reseller Program


By Greg Sterling

As Yahoo has turned Gemini into a more expansive search marketing platform, it has also expanded the ways the platform is sold. The company is adding resellers to its Preferred Partner Program; its latest partner is small business marketing platform ReachLocal.

Existing Preferred Partners include Marin, Acquisio and Kenshoo, among others.

For years, Google has operated an extensive reseller program for AdWords (called Preferred SMB Partners), which is intended to reach more deeply into the SMB market. The function of the Google program is to bring AdWords to SMBs that otherwise might not do self-service or would be likely to stumble or fail at campaign self-management.

Google has said that when many SMBs self-serve, it sees higher churn than when partners or agencies manage those AdWords accounts on behalf of local business owners. Yahoo has been following Google’s lead in starting to build out a similar network of partners to sell and support Gemini for local businesses.

Yahoo’s search marketing inventory is separating from Bing’s, though it’s not yet entirely separate. Companies like ReachLocal will now be representing Gemini, as mostly distinct inventory and traffic on PC and mobile, to thousands of small business customers.

ReachLocal indicated in an email interview that it isn’t yet selling native advertising on Yahoo as part of the arrangement but that it will be adding both PC and mobile traffic from Yahoo search to its existing paid-search advertising network.

While the small business market has long been an important and attractive (though challenging) target of major internet companies, competition has intensified, with companies like Google and Facebook seeking to be the go-to digital marketing platform for these businesses. Traditional media companies that used to serve these advertisers with their own products exclusively have effectively become agencies selling third-party traffic and inventory to SMBs.

The post Yahoo Follows Google In Building Out Local Search Marketing Reseller Program appeared first on Search Engine Land.



India A Second Front In Google’s Antitrust Battles With Foreign Regulators


By Greg Sterling

While Google’s antitrust investigation in Europe has received considerable attention, a similar, ongoing investigation in India has been far less well covered. However, an article in the India-based Economic Times suggests a legal environment in the South Asian country that is no less challenging for Mountain View.

The article asserts, “Flipkart, Facebook, Nokia’s maps division, MakeMyTrip.com and several other companies have corroborated complaints that US Internet giant Google abused its dominant market position, in their response to queries raised by the Competition Commission of India (CCI).”

The Indian antitrust investigation began in early 2014 and has two areas of focus: whether Google abused its position in promoting its own vertical results (similar to the case in Europe) and whether it engaged in unfair competition in the administration of AdWords.

The CCI recently issued a report, which I have not had an opportunity to review, that apparently argues Google did in fact violate Indian competition law. According to the Economic Times article, Google needs to formally respond by September 10 as well as appear in person before the commission.

The CCI was established by India’s Competition Act of 2002. The law seeks to protect competition in the Indian market by prohibiting anti-competitive mergers, abuse of dominant market position and anti-competitive contracts.

Google apparently cannot settle the alleged antitrust claims in India as it can (or could have) in Europe and as it did in the US. The CCI is supposed to either find a violation or exonerate the company or companies in question. If Google is found in violation of Indian competition law, the CCI could impose a fine of up to 10 percent of Google’s income — in other words, billions of dollars.

The CCI can also seek “structural remedies” that include breaking up anti-competitive enterprises. However, as a practical matter, that’s not going to happen in this case.

The post India A Second Front In Google’s Antitrust Battles With Foreign Regulators appeared first on Search Engine Land.



3 Steps To Find And Block Bad Bots


By Ben Goodsell

Most SEOs have heard about using Log Files to understand Googlebot behavior, but few seem to know they can also be used to identify bad bots crawling your site. More and more, these bots are executing JavaScript, inflating analytics numbers, consuming resources, and scraping and duplicating content.

The Incapsula 2014 bot traffic report looked at 20,000 websites (of all sizes) over a 90-day period and found that bots account for 56% of all website traffic; 29% were malicious in nature. The report also showed that the more you build your brand, the larger a target you become.

[Chart: distribution of good and bad bot traffic, via Incapsula]

While there are services out there that automate much more advanced techniques than what’s shown here, this article is meant to be an easy starting point (using Excel) to understand the basics of using Log Files, blocking bad bots at the server level and cleaning up Analytics reports.

1. Find Log Files

All servers keep a list of every request to the site they host. Whether a customer is using the Firefox browser or Googlebot is looking for newly created pages, all activity is recorded in a simple file.
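For example, a single request recorded in Apache’s widely used “combined” log format looks like this (all values below are illustrative):

```
203.0.113.42 - - [31/Aug/2015:10:15:32 -0700] "GET /blog/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```

The fields are the client IP, identity and user (usually “-”), timestamp, request line, status code, response size, referrer and User Agent string. The two fields this article cross-references are the first (IP) and the last (User Agent).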

The location of these log files depends on the type of server or host you have. Here are some details on common platforms.

  • cPanel: A common interface for Apache hosts (seen below); makes finding log files as easy as clicking a link.

[Screenshot: the Raw Access Logs section in cPanel]

  • Apache: Log Files are typically found in /var/log and subdirectories; also, using the locate access.log command will quickly spot server logs.
  • IIS: On Microsoft servers, logging can be enabled and configured in Internet Services Manager. Go to Control Panel -> Administrative Tools -> Internet Services Manager -> Select website -> Right-click then Properties -> Website tab -> Properties -> General Properties tab.

2. Identify Number Of Hits By IP & User Agents

Once the files have been found, consolidate them, then open them in Excel (or your preferred tool). Given the size of some log files, this is often easier said than done, but for most small to medium sites, a computer with plenty of processing power should be sufficient.

Below, .log files were manually consolidated into a new .txt file using a plain text editor, then opened in Excel using text-to-columns and a “space” delimiter, with a little additional cleanup to get the column headers to line up.

[Screenshot: consolidated log files opened in Excel]

Find Number of Hits by IP

After consolidating and opening logs in Excel, it’s fairly easy to find the number of hits by IP.

To do this:

  1. Create a Pivot Table, look at Client IP and get counts.
  2. Copy and paste the results, rename the column headers to Client IP and Hits, sort descending, then insert a User Agent column to the right of Hits.

[Screenshot: hits by Client IP in an Excel pivot table]

Find User Agents By IP

As a final step in identifying potential bad bots, find which user agents are associated with IPs hitting your site the most. To do this, go back to the pivot table and simply add the User Agent to the Row Label section of the Pivot Table.

Now, finding the User Agent associated with the top-hitting IP is as simple as a text search. In this case, the top IP had no declared User Agent (it was from China) and hit the site over 80,000 times more than any other IP.
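If Excel struggles with the file size, the same counts can be scripted. Below is a minimal Python sketch, assuming Apache’s combined log format and a consolidated file named access.log (both assumptions; adjust the regex and path to your own logs):

```python
import re
from collections import Counter, defaultdict

# Matches Apache's combined log format: IP, identity, user, timestamp,
# request line, status, size, referrer, user agent. A simplification;
# real-world logs vary, so unmatched lines are simply skipped.
LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()                # total requests per IP
agents = defaultdict(Counter)   # user agents seen per IP

with open("access.log") as log:  # hypothetical consolidated log file
    for line in log:
        m = LINE.match(line)
        if not m:
            continue
        ip = m.group("ip")
        ua = m.group("ua") or "(no declared user agent)"
        hits[ip] += 1
        agents[ip][ua] += 1

# Show the 20 busiest IPs with the user agent each used most often.
for ip, count in hits.most_common(20):
    top_ua = agents[ip].most_common(1)[0][0]
    print(f"{count:>8}  {ip:<15}  {top_ua}")
```

An IP with a huge hit count and a blank or generic User Agent is a strong candidate for step 3.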

[Screenshot: the top-hitting IP, with no declared User Agent]

3. Block IPs From Accessing The Site And Showing Up In Analytics

Now that the malicious IP has been identified, use these instructions to prevent number inflation in Analytics, then block that IP from accessing the site completely.

Blocking An IP In Analytics

Using Filters in Google Analytics, you can exclude IPs. Navigate to Admin -> Choose View (always a good idea to Create New View when making changes like this) -> Filters -> + New Filter -> Predefined -> Exclude traffic from the IP addresses -> Specify IP (regular expression).
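Because the field takes a regular expression, dots in the address need to be escaped. Two hypothetical patterns:

```
198\.51\.100\.4    (a single address)
^198\.51\.100\.    (an entire /24 range)
```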

[Screenshot: excluding an IP address with a Google Analytics filter]

Tip: Google Analytics automatically blocks known crawlers identified by the IAB (a $14,000 value for non-members). Just navigate to Admin -> View Settings, and under “Bot Filtering,” check “Exclude all hits from known bots and spiders.” It’s always a best practice to create a new view before altering profile settings.

If you use Omniture, there are three methods to exclude data by IP.

  1. Exclude by IP. Excludes hits from up to 50 IPs.
  2. Vista Rule. For companies that need more than 50.
  3. Processing Rule. It’s possible to create a rule that prevents showing data from particular IPs.

Blocking An IP At The Server Level

As with locating the log files, the method of blocking IPs from accessing your site at the server level depends on the type of server you use.
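As one example: on an Apache server, a few lines in an .htaccess file (or the main configuration) will deny a single address. This is a minimal sketch using a hypothetical offending IP; nginx, IIS and other servers have their own equivalents:

```
# Apache 2.2 syntax: deny one (hypothetical) abusive IP
Order Allow,Deny
Allow from all
Deny from 198.51.100.4

# Apache 2.4 syntax for the same rule (use one or the other,
# depending on your Apache version)
<RequireAll>
    Require all granted
    Require not ip 198.51.100.4
</RequireAll>
```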

Conclusion

Third-party solutions route all traffic through a network to identify bots (good and bad) in real time. They don’t just look at IPs and User Agent strings, but also HTTP headers, navigational site behavior and many other factors. Some sites use methods like reCAPTCHA to ensure their visitors are human.

What other methods have you heard of that can help protect against the “rise of the bad bots”?

The post 3 Steps To Find And Block Bad Bots appeared first on Search Engine Land.



6 Non-SEO Tools You Should Be Using For SEO


By Brian Patterson

When it comes to tools, I’m like Depeche Mode: I just can’t get enough. Our team is constantly on the lookout for shiny new things that will make us more efficient and better at our jobs. In fact, this is so important to us that we make yearly goals regarding how many tools we want to test and implement.

And while we tinker with every SEO-focused tool out there, we also explore tools not necessarily meant for our industry. As with anything, there are hits and misses. I’d like to share with you six of our favorite hits.

1. InSite 5

In a Google webmaster video, Matt Cutts hints at grammar having an impact on your rankings.

Rankings aside, visitor trust and conversions can only be improved by eliminating pesky spelling and grammar issues on your site.

That is why we really like InSite 5. It is desktop software (PC only, sadly) that crawls your website looking for spelling and grammar errors. You can also customize the dictionary to eliminate false positives. When the crawl is done, it creates a nice PDF report that you can hand off for someone to act on.

This is a great tool to run on a regular basis against all of the sites you work on.

URL: http://www.inspyder.com/products/InSite

Cost: $60

2. Attentiv


Collaboration is vital to what we do, as we always have a team of at least five people working on a project (project management, technical SEO, content, design, link building, etc.). Attentiv makes this collaboration easy and asynchronous, with threaded commenting, polling and idea upvoting to help us to get more creative and decisive. We keep Attentiv open all day, sitting in a tab next to our email.

URL: http://attentiv.com/

Cost: Free for first 10 users. $5/user per month after that.

3. Canva


Sometimes, you need a graphic quick — like, right now. And while I love our designers, they are generally working from a priority queue and are also perfectionists, so things don’t happen immediately.

If I need a great open graph image or a custom image to support a blog post, I’ll often turn to Canva to quickly put something together. I’m not a designer; I’m just one of those guys who thinks he knows what does and doesn’t look good, and I’m always quite pleased with what even I’m able to do in Canva with just five minutes of work.

URL: https://www.canva.com/

Cost: Free to edit images. They also have a stock photo library that you can pull from, with photos costing $1 each.

4. Infogr.am


Yes, infographics still have their place in SEO, particularly for the right data and message. Infogr.am is a non-designer-friendly infographic maker, and it can come in handy when you’re in a pinch.

However, we also use it for more than just infographics. When we’re working on creative content for use in client marketing campaigns, or when we just want to make some charts look really good for our client reports, we turn to Infogr.am. It is quick and easy to create charts, and it visually crushes any chart you’d create in MS Office.

URL: https://infogr.am

Cost: 30-day free trial. Starting at $15/month after that.

5. Cision Media Database


This is one of the more expensive tools we invest in, but we’ve renewed every year because of the value it provides.

The Media Contact Database contains information about almost every news outlet and reporter out there: it has the topics they cover, all of their contact info, and all of their social media accounts. It’s a starting point in a broader relationship-building and content promotion process, but an important point at that.

URL: http://www.cision.com/us/pr-software/media-database/

Cost: We can only speak for Vocus’ Media Contact Database, which was acquired by Cision. Vocus didn’t publish the price (nor does Cision). It starts around $4,800/yr., but it is very negotiable.

6. Title Tester


Title Tester is beautiful in its simplicity, and it’s something we use with every piece of content that goes out the door. We even tested the title of this very blog post. What you do is craft several good title options for the content you are creating.

You put those into Title Tester and it provides you with a link to share with your network. Everyone in your network can vote on their favorites, and in the end you have a semi-data-driven approach to selecting your title. It’s very fast and very effective.

URL: https://www.titletester.com/

Cost: Free if you have your own friends/family vote on your title options. You can also pay a nominal amount for the platform’s own pool of users to vote.

Do you have a favorite non-SEO tool that you use for SEO? Let us know in the comments!

The post 6 Non-SEO Tools You Should Be Using For SEO appeared first on Search Engine Land.
