Get More Customers To My Website



When Is a Blog the Right Form of Content Marketing?


By Isla_McKetta


You’ve heard the wisdom:

“Your business should have a blog.”

“Blogging helps your SEO.”

“Why aren’t you blogging yet?”

According to the experts, a blog will solve all your Internet woes. Blogging will increase your traffic, expand your audience, improve your engagement, position you as an authority, and allow you to shape the message in your space.

In fact, blogging is so hyped as a panacea, you’d think that simply adding a blog to your site would also help you find the perfect spouse, cure the common cold, and even turn lead into gold.

While I won’t deny the power of a good blog on the right site (seriously, as a writer, I’m pro-blog in general) to do all of those good things and more, you should always question anything that’s touted as the right answer for everyone (and everything). So should you blog?

When a blog is NOT necessarily the right form of content marketing

Now that you’re asking whether all that time and energy you’re putting (or planning to put) into your blog is really the right investment, let’s look at a few examples of when blogging is a bad idea (or is simply unnecessary).

1. You own your market

Johnson & Johnson. Amazon. Target. Google. These companies have already captured the hearts and minds of so many consumers that their names are nearly synonymous with their products. Here’s why blogging would only offer each of them a marginal benefit.

Traffic

Does Johnson & Johnson really care about traffic to its site when you already have Band-Aids (and all their other name brand products) in your medicine cabinet? Sure, they produce infographics, but there’s no real blog, and you were going to buy their products anyway, right?

Audience reach

Ordering anything from books to pet-waste bags online? You didn’t need a blog to discover Amazon; it’s so ingrained in your Internet history that you probably went straight there, and those products will be on your doorstep in two days or less.

Engagement

Target mastered engagement when Oprah and Tyra started referring to the store as Tarzhay and shoppers only got more loyal as they added designer labels at discount prices. It didn’t matter that most of their products weren’t even available on their website, let alone that they didn’t have a blog. Their site has gotten a lot better in the past decade, but they still don’t need a blog to get customers in the door.

Authority

And Google… Sure they have a blog, but Google is such an authority for search queries that most of the consumers of their search results have no interest in, or need for, the blog.

So if you have little or no competition or your business is (and you expect it to remain) the top-of-mind brand in your market, you can skip blogging.

2. You have a better way of getting customers into the top of your funnel

A blog is only one way to attract new customers. For example, I live less than a mile from the nearest grocery store, and I can get there and back with a spare stick of butter before my oven even warms up. If the next nearest store had the most amazing blog ever, I’m still not going to go there when I’m missing an ingredient. But if they send me a coupon in the mail, I might just try them out when it’s less of an emergency.

The point is that different types of businesses require different types of tactics to get customers to notice them.

My mom, a small-town accountant who knows all of her clients by name, doesn’t blog. She’s much more likely to get recommended by a neighbor than to be found on the Internet. If paid search brings you $50k in conversions every month and your blog contributes to $10k, it’s easy (and fair) to prioritize paid search. If you find that readers of white papers are the hottest leads for your SaaS company, offering a 50:1 ROI over blog readers, write those white papers. And if your customers are sharing your deals across email and/or social at a rate that your blog has never seen, give them more of what they want.

None of that means you’ll never have to create a blog. Instead, a blog might be something to reassess when your rate of growth slows in any of those channels, but if you’ve crunched your numbers and a blog just doesn’t pan out for now, use the tactics your customers are already responding to.

3. The most interesting things about your business are strictly confidential (or highly complicated)

Sure the CIA has a blog, but with posts like “CIA Unveils Portrait of Former Director Leon E. Panetta” and “CIA Reaches Deep to Feed Local Families” it reads more like a failed humanizing effort than anything you’d actually want to subscribe to (or worse, read). If you’re in a business where you can’t talk about what you do, a blog might not be for you.

For example, while a CPA who handles individual tax returns might have success blogging about tips to avoid a big tax bill at year end, a Big Four accounting firm that specializes in corporate audits might want to think twice about that blog. Do you really have someone on hand who has something new and interesting to say about Sarbanes-Oxley and has the time to write?

The difference is engagement. So if you’re in a hush-hush or highly technical field, think about what you can reasonably write about and whether anyone is going to want (or legally be able) to publicly comment on or share what you’re writing.

Instead, you might want to take the example of Deloitte, which thinks beyond the concept of your typical blog to create all kinds of interesting evergreen content. The result is a host of interesting case studies and podcasts that could have been last updated three years ago for all it matters. This puts content on your site, but it also allows you to carefully craft and vet that content before it goes live, without building any expectation associated with an editorial calendar.

4. You think “thought leadership” means rehashing the news

There is a big difference between curating information and regurgitating it. True life confession: As much as I hate the term “thought leader,” I used it many a time in my agency days as a way to encourage clients to find the best in themselves. But the truth is, most people don’t have the time, energy, or vision to really commit to becoming a thought leader.

A blog can be a huge opportunity to showcase your company’s mastery and understanding of your industry. But if you can’t find someone to write blog posts that expand on (or rethink) the existing knowledge base, save your ink.

Some people curate and compile information in order to create “top 10”-type posts. That kind of content can be helpful for readers who don’t have time to source content on their own, but I wouldn’t suggest it as the core content strategy for a company’s blog. If that’s all you have time for, focus on social media instead.

5. Your site is all timely content

A blog can help you shape the message around your industry and your brand, but what if your brand is built entirely around messaging? The BBC doesn’t need a blog because any reader would expect what they’re reading to be timely content and to adhere to the BBC’s standard voice. If readers want to engage with the content by commenting on the articles, they can.

If you can explain the value that blogs.foxnews.com adds to the Fox News site, you’ve got a keener eye for content strategy than I do. My guess, from the empty blog bubbles here, is that this is a failed (or abandoned) experiment and will soon disappear.

6. Your business is truly offline

There’s one final reason that blogging might not fit your business model, and that’s if you have chosen not to enter the digital realm. I recently had lunch with a high-end jeweler in India who was debating whether to go online (he was worried that his designs might get stolen) or to continue doing business in person the way his family had for at least three generations.

If you are successful at selling your products offline, especially if your product has as much variation as a gemstone, an argument can be made for staying offline entirely.

When you should be blogging

Now that we’ve looked at some times it’s okay not to have a blog, let’s take a quick look at five reasons you might want to blog as part of your content marketing strategy (just in case you thought you’d gotten off scot-free by almost fitting into one of the boxes above).

1. You want traffic to your website

Conventional wisdom goes that the more pages you build, the more chances you have to rank. Heck, the more (good) content you create on your blog, the more collateral you have to showcase on your social channels, in email, and anywhere else you want to.

2. You want to expand your audience

If the content you’re creating is truly awesome, people will share it and find it and love it. Some of those people will be potential customers who haven’t even heard of you before. Keep up the excellence and you might just keep them interested.

3. You want to connect with customers

That blog is a fantastic place to answer FAQs, play with new ideas, and show off the humanity of all those fantastic individuals you have working for you. All of those things help customers get to know you, plus they can engage with you directly via the comments. You might just find ideas for new campaigns and even new products by creating that venue for conversation.

4. You have something to add to the discussion

Do you really have a fresh perspective on what’s going on in your industry? Help others out by sharing your interesting stories and thoughtful commentary. You’re building your authority and the authority of your company at the same time.

5. You’re ready to invest in your future

Content is a long game, so the payoffs from blogging may be farther down the road than you might hope. But if a blog is right for your company, you’re giving yourself the chance to start shaping the message about your industry and your company the day you publish your first post. Keep at it and you might find that you start attracting customers from amongst your followers.

The gist

Don’t blog just because someone told you to. A blog is a huge investment and sustaining that blog can take a lot of work. But there are a lot of good reasons to dig in and blog like you mean it.

What’s your decision? Do you have a good reason that you’ve decided to abstain from blogging? Or have you decided that a blog is the right thing for your business? Help others carefully consider their investment in blogging by sharing your story in the comments.



Google Quietly Updates The International Targeting Hreflang Webmaster Tools Reporting


By Barry Schwartz

Google has updated the “accuracy” of the reporting within the Google Webmaster Tools International Targeting section.

On November 18th, Google said:

Google refined the accuracy for reporting hreflang tags. This update might show as a sudden change in the report graph.

You now may see an “update” line in the graph that conveys when the reporting for hreflang tags was deployed. Here is a screen shot from Rahul Mistry of Blue Post Digital, who informed us of this update:

In mid-July, Google officially launched International Targeting within Google Webmaster Tools.

Not many webmasters picked up on this update from a week or so ago.

If you notice the graph change drastically on the update date, there is no reason to worry.
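
For context, the report covers hreflang annotations: the link elements (or sitemap/HTTP header entries) that tell Google which language and country versions of a page belong together. A common issue this kind of report surfaces is a missing return tag, where page A points to page B but B never points back to A. As a rough, hypothetical illustration only (not how Google generates the report), here is a minimal Python sketch that checks a hand-built map of pages for reciprocal hreflang annotations; the URLs and annotations are made up.

# Minimal sketch: flag hreflang annotations that lack a return tag.
# The data below is a made-up example; in practice you would extract the
# <link rel="alternate" hreflang="..."> elements from each page.

# Map of page URL -> {hreflang value: target URL} declared on that page.
hreflang_map = {
    "https://example.com/en/": {
        "en": "https://example.com/en/",
        "de": "https://example.com/de/",
    },
    # The German page is missing its "en" return tag.
    "https://example.com/de/": {"de": "https://example.com/de/"},
}

def find_missing_return_tags(pages):
    """Yield (source, target, lang) for annotations with no return link."""
    for source, annotations in pages.items():
        for lang, target in annotations.items():
            if target == source:
                continue  # self-referencing annotation, nothing to verify
            # A valid pair requires the target page to annotate the source back.
            if source not in pages.get(target, {}).values():
                yield source, target, lang

if __name__ == "__main__":
    for source, target, lang in find_missing_return_tags(hreflang_map):
        print(f"{target} is missing a return hreflang tag back to {source} ({lang})")

That class of “no return tags” error is exactly the sort of thing the International Targeting report counts, which is why a change in how Google counts them can make the graph jump even when nothing changed on your site.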

The post Google Quietly Updates The International Targeting Hreflang Webmaster Tools Reporting appeared first on Search Engine Land.


The Fascinating Way Google Is Recruiting New Talent by @mattsouthern


By Matt Southern

If you’re a coder who aspires to work for Google, don’t worry about going to them; they’ll come to you. Using none other than its own search engine, Google is inviting coders to take a test to prove they have the skills to work there. A user on the Hacker News message board reports having stumbled on this test after searching for a topic related to Python. You can imagine his surprise when the page suddenly split in the middle with some text saying something to the effect of, “You speak our language, would you like to take […]

The post The Fascinating Way Google Is Recruiting New Talent by @mattsouthern appeared first on Search Engine Journal.


The Danger of Crossing Algorithms: Uncovering The Cloaked Panda Update During Penguin 3.0


By glenngabe


Penguin 3.0 was one of the most anticipated algorithm updates in recent years when it rolled out on October 17, 2014. Penguin hadn’t run for over a year at that point,
and there were many webmasters sitting in Penguin limbo waiting for recovery. They had cleaned up their link profiles, disavowed what they could, and were
simply waiting for the next update or refresh. Unfortunately, Google was wrestling with the algo internally and over twelve months passed without an
update.

So when Pierre Far finally announced Penguin 3.0 a few days later on October 21, a few things stood out. First, this was not the new algorithm Gary Illyes had explained it would be at SMX East. It was a refresh, and it underscored the potential problems Google was battling with Penguin (cough, negative SEO).

Second, we were not seeing the impact that we expected. The rollout seemed to begin with a heavier international focus, and the overall U.S. impact has been underwhelming, to say the least. There were definitely many fresh hits globally, but there were a number of websites that should have recovered but didn’t for some reason. And many are still waiting for recovery today.

Third, the rollout would be slow and steady and could take weeks to fully complete. That’s unusual, but makes sense given the microscope Penguin 3.0 was
under. And this third point (the extended rollout) is even more important than most people think. Many webmasters are already confused when they get hit
during an acute algorithm update (for example, when an algo update rolls out on one day). But the confusion gets exponentially worse when there is an
extended rollout.

The more time that goes by between the initial launch and the impact a website experiences, the more questions pop up. Was it Penguin 3.0 or was it
something else? Since I work heavily with algorithm updates, I’ve heard similar questions many times over the past several years. And the extended Penguin
3.0 rollout is a great example of why confusion can set in. That’s my focus today.


Penguin, Pirate, and the anomaly on October 24

With the Penguin 3.0 rollout, we also had Pirate 2 rolling out. And yes, there are some websites that could be impacted by both. That added a layer of complexity to the situation, but nothing like what was about to hit. You see, I picked up a very strange anomaly on October 24. And I clearly saw serious movement on that day (starting late in the day ET).

So, if there was a third algorithm update, then that’s three potential algo updates rolling out at the same time. More about this soon, but it underscores the confusion that can set in when we see extended rollouts, with a mix of confirmed and unconfirmed updates.


Penguin 3.0 tremors and analysis

Since I do a lot of Penguin work, and have researched many domains impacted by Penguin in the past, I heavily studied the Penguin 3.0 rollout and published a blog post based on analyzing the first ten days of Penguin 3.0 which included some interesting findings for sure.

And based on the extended rollout, I definitely saw Penguin tremors beyond the initial October 17 launch. For example, check out the screenshot below of a website seeing Penguin impact on October 17, 22, and 25.

But as mentioned earlier, something else happened on October 24 that set off sirens in my office. I started to see serious movement on sites impacted by Panda, and not Penguin. And when I say serious movement, I’m referring to major traffic gains or losses all starting on October 24. Again, these were sites dealing heavily with Panda that had clean link profiles. Check out the trending below from October 24 for several sites that saw impact.


A good day for a Panda victim:



A bad day for a Panda victim:



And an incredibly frustrating day for a 9/5 recovery that went south on 10/24:

I saw this enough that I tweeted heavily about it and included a section about Panda in my Penguin 3.0 blog post. And that’s when something wonderful happened, and it highlights the true beauty and power of the internet.

As more people saw my tweets and read my post, I started receiving messages from other webmasters explaining that they saw the exact same thing on their own websites dealing with Panda, not Penguin. And not only did they tell me about it, they showed me the impact.

I received emails containing screenshots and tweets with photos from Google Analytics and Google Webmaster Tools. It was amazing to see, and it confirmed
that we had just experienced a Panda update in the middle of a multi-week Penguin rollout. Yes, read that line again. Panda during Penguin, right when the
internet world was clearly focused on Penguin 3.0.

That was a sneaky move Google… very sneaky. :)

So, based on what I explained earlier about webmaster confusion and algorithms, can you tell what happened next? Yes, massive confusion ensued. We had the
trifecta of algorithm updates with Penguin, Pirate, and now Panda.


Webmaster confusion and a reminder of the algo sandwich from 2012

So, we had a major algorithm update during two other major algorithm updates (Penguin and Pirate) and webmaster confusion was hitting extremely high
levels. And I don’t blame anyone for being confused. I’m neck deep in this stuff and it confused me at first.

Was the October 24 update a Penguin tremor or was this something else? Could it be Pirate? And if it was indeed Panda, it would have been great if Google told
us it was Panda! Or did they want to throw off SEOs analyzing Penguin and Pirate? Does anyone have a padded room I can crawl into?

Once I realized this was Panda, and started to communicate the update via Twitter and my blog, I had a number of people ask me a very important question:


“Glenn, would Google really roll out two or three algorithm updates so close together, or at the same time?”

Why yes, they would. Anyone remember the algorithm sandwich from April of 2012? That’s when Google rolled out Panda on April 19, then Penguin 1.0 on April 24,
followed by Panda on April 27. Yes, we had three algorithm updates all within ten days. And let’s not forget that the Penguin update on April 24, 2012 was the
first of its kind! So yes, Google can, and will, roll out multiple major algos around the same time.

Where are we headed? It’s fascinating, but not pretty


Panda is near real-time now

When Panda 4.1 rolled out on September 23, 2014, I immediately disliked the title and version number of the update. Danny Sullivan named it 4.1, so it stuck. But for me, that was not 4.1… not even close. It was more like 4.75. You see, there have been a number of Panda tremors and updates since P4.0 on May 20, 2014.

I saw what I was calling “tremors” nearly weekly, based on having access to a large amount of Panda data (across sites, categories, and countries). And based on what I was seeing, I reached out to John Mueller at Google to clarify the tremors. John’s response was great and confirmed what I was seeing. He explained that there was not a set frequency for algorithms like Panda. Google can roll out an algorithm, analyze the SERPs, refine the algo to get the desired results, and keep pushing it out. And that’s exactly what I was seeing (again, almost weekly since Panda 4.0).


When Panda and Penguin meet in real time…

…they will have a cup of coffee and laugh at us. :) So, since Panda is near-real time, the crossing of major algorithm updates is going to happen.
And we just experienced an important one on October 24 with Penguin, Pirate, and Panda. But it could (and probably will) get more chaotic than what we have now.
We are quickly approaching a time where major algorithm updates crafted in a lab will be unleashed on the web in near-real time or in actual real time.

And if organic search traffic from Google is important to you, then pay attention. We’re about to take a quick trip into the future of Google and SEO. And
after hearing what I have to say, you might just want the past back…


Google’s brilliant object-oriented approach to fighting webspam

I have presented at the past two SES conferences about Panda, Penguin, and other miscellaneous disturbances in the force. More about those “other
disturbances” soon. In my presentation, one of my slides looks like this:

Over the past several years, Google has been using a brilliant, object-oriented approach to fighting webspam and low quality content. Webspam engineers can
craft external algorithms in a lab and then inject them into the real-time algorithm whenever they want. It’s brilliant because it isolates specific
problems, while also being extremely scalable. And by the way, it should scare the heck out of anyone breaking the rules.

For example, we have Panda, Penguin, Pirate, and Above the Fold. Each was crafted to target a specific problem and can be unleashed on the web whenever
Google wants. Sure, there are undoubtedly connections between them (either directly or indirectly), but each specific algo is its own black box. Again,
it’s object-oriented.

Now, Panda is a great example of an algorithm that has matured to where Google highly trusts it. That’s why Google announced in June of 2013 that Panda
would roll out monthly, over ten days. And that’s also why it matured even more with Panda 4.0 (and why I’ve seen tremors almost weekly.)

And then we had Gary Illyes explain that Penguin was moving along the same path. At SMX East,
Gary explained that the new Penguin algorithm (which clearly didn’t roll out on October 17) would be structured in a way where subsequent updates could be rolled out more easily.
You know, like Panda.

And by the way, what if this happens to Pirate, Above the Fold, and other algorithms that Google is crafting in its Frankenstein lab? Well my friends, then
we’ll have absolute chaos and society as we know it will crumble. OK, that’s a bit dramatic, but you get my point.

We already have massive confusion now… and a glimpse into the future reveals a continual flow of major algorithms running in real-time, each that
could pummel a site to the ground. And of course, with little or no sign of which algo actually caused the destruction. I don’t know about you, but I just
broke out in hives. :)


Actual example of what (near) real-time updates can do

After Panda 4.0, I saw some very strange Panda movement for sites impacted by recent updates. And it underscores the power of near-real time algo updates. As a quick example, temporary Panda recoveries can happen if you don’t get out of the gray area enough. And now that we are seeing Panda tremors almost weekly, you can experience potential turbulence several times per month.

Here is a screenshot from a site that recovered from Panda, didn’t get out of the gray area, and reentered the strike zone just five days later.

Holy cow, that was fast. I hope they didn’t plan any expensive trips in the near future. This is exactly what can happen when major algorithms roam the web in real time. One week you’re looking good and the next week you’re in the dumps. Now, at least I knew this was Panda. The webmaster could tackle more content problems and get out of the gray area… But the ups and downs of a Panda roller coaster ride can drive a webmaster insane. It’s one of the reasons I recommend making significant changes when you’ve been hit by Panda. Get as far out of the gray area as possible.


An “automatic action viewer” in Google Webmaster Tools could help (and it’s actually being discussed internally by Google)

Based on webmaster confusion, many have asked Google to create an “automatic action viewer” in Google Webmaster Tools. It would be similar to the “manual
actions viewer,” but focused on algorithms that are demoting websites in the search results (versus penalties). Yes, there is a difference by the way.

The new viewer would help webmasters better understand the types of problems that are being impacted by algorithms like Panda, Penguin, Pirate, Above the
Fold, and others. Needless to say, this would be incredibly helpful to webmasters, business owners, and SEOs.

So, will we see that viewer any time soon? Google’s John Mueller addressed this question during the November 3 webmaster hangout (at 34:54).

John explained they are trying to figure something out, but it’s not easy. There are so many algorithms running that they don’t want to provide feedback
that is vague or misleading. But, John did say they are discussing the automatic action viewer internally. So you never know…


A quick note about Matt Cutts

As many of you know, Matt Cutts took an extended leave this past summer (through the end of October). Well, he announced on Halloween that he is extending his leave into 2015. I won’t go crazy here talking about his decision overall, but I will
focus on how this impacts webmasters as it relates to algorithm updates and webspam.

Matt does a lot more than just announce major algo updates… He actually gets involved when collateral damage rears its ugly head. And there’s not a
faster way to rectify a flawed algo update than to have Mr. Cutts involved. So before you dismiss Matt’s extended leave as uneventful, take a look at the
trending below:

Notice the temporary drop off a cliff, then 14 days of hell, only to see that traffic return? That’s because Matt got involved. That’s the movie blog fiasco from early 2014 that I heavily analyzed. If Matt had not been notified of the drop via Twitter, and had not taken action, I’m not sure the movie blogs that got hit would be around today. I told Peter from SlashFilm that his fellow movie blog owners should all pay him a bonus this year. He’s the one who pinged Matt via Twitter and got the ball rolling.

It’s just one example of how having someone with power out front can nip potential problems in the bud. Sure, the sites experienced two weeks of utter
horror, but traffic returned once Google rectified the problem. Now that Matt isn’t actively helping or engaged, who will step up and be that guy? Will it
be John Mueller, Pierre Far, or someone else? John and Pierre are greatly helpful, but will they go to bat for a niche that just got destroyed? Will they
push changes through so sites can turn around? And even at its most basic level, will they even be aware the problem exists?

These are all great questions, and I don’t want to bog down this post (it’s already incredibly long). But don’t laugh off Matt Cutts taking an extended leave. If he’s gone for good, you might only realize how important he was to the SEO community after he’s gone. And hopefully it’s not because your site just tanked as collateral damage during an algorithm update. Matt might be running a marathon or trying on new Halloween costumes. Then where will you be?


Recommendations moving forward:

So where does this leave us? How can you prepare for the approaching storm of crossing algorithms? Below, I have provided several key bullets that I think
every webmaster should consider. I recommend taking a hard look at your site
now, before major algos are running in near-real time.

  • Truly understand the weaknesses with your website. Google will continue crafting external algos that can be injected into the real-time algorithm.
    And they will go real-time at some point. Be ready by cleaning up your site now.
  • Document all changes and fluctuations the best you can. Use annotations in Google Analytics and keep a spreadsheet updated with detailed information (a minimal change-log sketch follows this list).
  • Along the same lines, download your Google Webmaster Tools data monthly (at least). After helping many companies with algorithm hits, that
    information is incredibly valuable, and can help lead you down the right recovery path.
  • Use a mix of audits and focus groups to truly understand the quality of your site. I mentioned in my post about aggressive advertising and Panda that human focus groups are worth their weight in gold (for surfacing Panda-related problems). Most business owners are too close to their own content and websites to accurately measure quality. Bias can be a nasty problem and can quickly lead to bamboo-overflow on a website.
  • Beyond on-site analysis, make sure you tackle your link profile as well. I recommend heavily analyzing your inbound links and weeding out unnatural
    links. And use the disavow tool for links you can’t remove. The combination of enhancing the quality of your content, boosting engagement, knocking down
    usability obstacles, and cleaning up your link profile can help you achieve long-term SEO success. Don’t tackle one quarter of your SEO problems. Address
    all of them.
  • Remove barriers that inhibit change and action. You need to move fast. You need to be decisive. And you need to remove red tape that can bog down
    the cycle of getting changes implemented. Don’t water down your efforts because there are too many chefs in the kitchen. Understand the changes that need
    to be implemented, and take action. That’s how you win SEO-wise.
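
As a concrete (if simplistic) illustration of the change-log bullet above, here is a minimal Python sketch that appends dated entries to a CSV you can later line up against known algorithm update dates. The file name and columns are assumptions for the example, not a prescribed format.

# Minimal sketch of a running change log for site changes and traffic
# fluctuations. The file name and columns are illustrative assumptions.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("site_change_log.csv")
FIELDS = ["date", "type", "description", "traffic_note"]

def log_entry(entry_type, description, traffic_note=""):
    """Append one dated row, writing a header if the file is new."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "type": entry_type,
            "description": description,
            "traffic_note": traffic_note,
        })

if __name__ == "__main__":
    log_entry("site change", "Removed thin tag pages and cut above-the-fold ads")
    log_entry("fluctuation", "Sharp organic drop starting late in the day",
              "Possible overlap of Penguin, Pirate, and Panda")

Pair a log like this with matching annotations in Google Analytics and your monthly Webmaster Tools exports, and you have a timeline you can trust when the next extended rollout muddies the water.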


Summary: Are you ready for the approaching storm?

SEO is continually moving and evolving, and it’s important that webmasters adapt quickly. Over the past few years, Google’s brilliant object-oriented approach to fighting webspam and low-quality content has yielded algorithms like Panda, Penguin, Pirate, and Above the Fold. And more are on their way. My advice is to get your situation in order now, before crossing algorithms blend a recipe of confusion that makes it exponentially harder to identify, and then fix, the problems riddling your website.

Now excuse me while I try to build a flux capacitor. :)



7 AdWords Features We Would Like to See in 2015 by @Rocco_Zebra_Adv


By Rocco Baldassarre

Google AdWords is adding new features at a fast pace, making it easier than ever for advertisers to manage a campaign. With 2015 approaching, I thought it would be great to compile a list of features that would make our ad optimization procedure more effective! 1. Customize The Device Bid Modifier at The Keyword Level We all know the importance of keywords in our accounts. Some keywords perform better than others, and we are able to customize their bids based on that. However, the behavior of mobile users is different from people navigating with a desktop device. As of today, […]

The post 7 AdWords Features We Would Like to See in 2015 by @Rocco_Zebra_Adv appeared first on Search Engine Journal.


Simple Tips To Set The Stage For Local SEO In 2015


By Greg Gifford

The year is almost over, and many businesses are starting to look forward to 2015 and discuss their marketing plans. Luckily, David Mihm, the local search guru at Moz, just released his annual Local Search Ranking Factors survey, which helps give us local marketers more insight into which ranking factors matter the most.

The survey shows a definite shift toward more traditional web ranking factors. Last year’s Local Search Ranking Factors survey had Google Places and Citations weighted heavily, but this year’s study shows that on-site signals and links are the most powerful factors.

This shift is consistent with Google’s recent local ranking algorithm update, Pigeon. Many local SEOs claimed they weren’t hit by Pigeon – but it’s more likely that, because they took a more holistic approach to local SEO, their sites simply had more authority to begin with.

The most important point we try to hammer home to potential clients is that you can’t fool the nerds at Google. Everything you do, both on and off your site, should be working toward the end goal of making your user experience awesome… not trying to fool Google into placing you higher on search results pages.

So, taking what we’ve been able to figure out about the Pigeon update and adding in the results from the 2014 Local Search Ranking Factors survey, here are two simple tips to help you set the stage for Local Search success in 2015:

  1. Be Awesome
  2. Earn Awesome Links

Yes, it’s really that simple… but at the same time, it’s really not that easy for local businesses. Take a look at your competitors in your vertical – nearly every website has the same or similar content, and most sites don’t have that many inbound links.

Okay, So How Are You Supposed To Be Awesome?

The best thing you can do for Local Search success in 2015 is to take all the energy you put into trying to fool Google and instead use that energy to make your site better.

Take a long, hard look at your site and look at your competitors’ sites. What can you do to be better? You know that your potential customers will be looking at multiple sites, so make your site the best in your vertical.

Make sure you’re avoiding these common pitfalls – they’re all basic, but we still see far too many sites tripping up on these:

  1. No Home Page Content. Your customers (and search engines) need to know what you’re all about. If your home page has a slider/banner and just a few sentences, you need to add more useful content there immediately.
  2. Only A Few Sentences On A Page. Your customers (and search engines) are checking your website for useful, relevant information. If you offer a product or service, don’t just say, “We sell X, call us for more information!” Today’s shoppers want immediate information, so you need to pack every page with useful content.
  3. Spamming Keywords. Far too many websites rely on this outdated tactic. You’re not going to rank well everywhere in your state simply because you listed out 100 cities separated by commas on your home page. Does that huge list of cities provide useful information for customers? No. Does it help you rank in Google? Definitely not. Get rid of the junk and populate your site with relevant, informative content instead.
  4. Awful Title Tags. You’ve got about 500 pixels of width for your title tags; anything longer will be truncated when it’s displayed in search results. The title tag should summarize the page – it shouldn’t be a huge chunk of keywords you’re trying to rank for. Put your primary keyword phrase at the beginning and your business name at the end. If you’ve got 100 keywords stuffed into your title tag, you just look desperate.

Don’t Forget Your Local Optimization

With on-site signals now carrying so much weight, it’s more important than ever to have your local optimization ducks in a row. It won’t do you any good to bang out a ton of citations if your site doesn’t include the local signals that Google expects it to have.

Again, these are old-school basics, but we hardly see any websites correctly optimizing for local areas:

  1. Include City/ST in your title tag. Remember, the title tag is incredibly important for optimization, and including your city and state is an important signal for local relevancy.
  2. Include City/ST in your H1 heading. It doesn’t have to be the entire heading in and of itself — what’s important here is to include your city and state in the page heading to further show local relevancy.
  3. Include City/ST in your content. Far too many sites forget to include City/ST information inside the site content. Optimizing for local search won’t work unless you’re talking about your local area in your content.
  4. Include City/ST in your alt text on images. It’s amazing how many times we see sites that don’t include alt text. Remember, Google can’t see what’s in your images, so alt text helps provide a better understanding of your page content. Including City/ST information can really help boost local relevancy.
  5. Include City/ST in your URL. If you’ve got the ability to edit your URL structure, try to include your city and state information in your URLs. Again, this can go a long way toward providing a stronger local signal to both customers and Google. Important Note: if you’re going to update your URLs, don’t forget to set up 301 redirects so that the old address is permanently pointed to the new one. (A quick audit sketch for these on-page signals follows this list.)
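
To make those checks concrete, here is a rough audit sketch in Python using BeautifulSoup. The sample URL, HTML, city, and state are made up, and the 65-character title check is only a loose proxy for the roughly 500-pixel display limit mentioned earlier; treat it as a starting point under those assumptions, not a definitive tool.

# Rough audit sketch for the local signals discussed above.
# Sample URL, HTML, city, and state are made up; swap in a real page.
from bs4 import BeautifulSoup

SAMPLE_URL = "https://example.com/dallas-tx-used-cars"
SAMPLE_HTML = """
<html><head><title>Used Cars in Dallas, TX | Example Motors</title></head>
<body>
  <h1>Used Cars in Dallas, TX</h1>
  <p>Example Motors has served Dallas, TX drivers since 1985.</p>
  <img src="lot.jpg" alt="Used car lot in Dallas, TX">
</body></html>
"""

def audit_local_signals(url, html, city, state):
    soup = BeautifulSoup(html, "html.parser")
    needle = f"{city}, {state}".lower()

    title = soup.title.get_text() if soup.title else ""
    h1 = soup.find("h1")
    body_text = soup.body.get_text() if soup.body else ""
    alts = [img.get("alt", "") for img in soup.find_all("img")]

    return {
        "title contains City/ST": needle in title.lower(),
        "title under ~65 characters (rough proxy for 500px)": len(title.strip()) <= 65,
        "H1 contains City/ST": h1 is not None and needle in h1.get_text().lower(),
        "body content contains City/ST": needle in body_text.lower(),
        "an image alt contains City/ST": any(needle in a.lower() for a in alts),
        "URL contains the city": city.lower().replace(" ", "-") in url.lower(),
    }

if __name__ == "__main__":
    results = audit_local_signals(SAMPLE_URL, SAMPLE_HTML, "Dallas", "TX")
    for check, passed in results.items():
        print(f"{'PASS' if passed else 'FAIL'}: {check}")

None of this replaces a human read-through, but it will catch pages that are missing the basics before you spend time on citations and links.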

These are all just specific tactics to help with the main goal: to make your site more awesome. Stop thinking about how to make your site rank, and start thinking about how to make your site the best in your niche. That’s how you’re going to get your site to rank better and convert more visitors.

The post Simple Tips To Set The Stage For Local SEO In 2015 appeared first on Search Engine Land.
