While short-tail keywords are often searched more frequently, it is more difficult to rank for them on search engines. Targeting long-tail keywords, on the other hand, gives you a better chance of ranking higher (even on the first page) for queries specific to your products and services—and higher ranking means more traffic. Plus, as search engines and voice-to-text capabilities advance, people are using more specific phrases to search online. There are many free tools available to help you find keywords to target, such as Answer the Public.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable examples are China, Japan, South Korea, Russia, and the Czech Republic, where Baidu, Yahoo! Japan, Naver, Yandex, and Seznam, respectively, are the market leaders.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random web surfer.
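The random-surfer idea above can be sketched as a short power-iteration loop. This is an illustrative toy, not Google's actual implementation; the function name, the tiny three-page graph, and the 0.85 damping factor are assumptions for the sketch.

```python
# Minimal sketch of the PageRank "random surfer" model (illustrative only).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # (1 - damping) models the surfer jumping to a random page
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # each outlink carries an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page: spread its rank across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A page with more (and stronger) inbound links ends up with a higher score:
# "b" is linked from both "a" and "c", so it outranks "a".
demo = pagerank({"a": ["b"], "b": ["c"], "c": ["a", "b"]})
```

Note how the scores always sum to 1: they form a probability distribution over where the random surfer is likely to be, which is exactly the estimate the paragraph describes.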
Search engine optimisation, or SEO, has become a huge priority for marketers over the last few years. It’s easy to see why—higher search engine rankings result in more traffic, more leads, and higher sales and conversions. But how, exactly, does it work? How does adding keywords to various site elements improve your chances of ranking well in search engines?
Search engines will usually crawl your articles automatically if they are high quality, but you should also submit your blog to search engines like Google and Bing. Search engines like Google have already simplified the process of submitting your content: Google Webmaster Tools makes it easy for any webmaster to get their website crawled faster.
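One common way to hand a search engine a list of your pages, as described above, is an XML sitemap submitted through its webmaster tooling. A minimal sketch (the URL and date here are placeholders, not real values) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/my-latest-post/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Listing each page with its last-modified date gives crawlers a hint about what is new, which is exactly what speeds up getting fresh content indexed.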
It’s free to be active in online groups and on websites that are relevant to your business and community—and it helps you to obtain more traffic. Comment on blogs and social media posts, answer questions people are posting, and participate in conversations about your industry. The more you engage with your community, the more exposure and profile visits you get.
It’s rare to come across new SEO tips worth trying. And this post has tons of them. I know that’s true BECAUSE… I actually read it all the way to the end and downloaded the PDF. What makes these great is that so many are small multi-step strategies, not just the one-off things to do that clients often stumble across and ask whether they are truly good for SEO. But there are also some nice one-off tips that I can easily start using without ramping up a new project.
Social media is one of the most popular free marketing tools around, and plays a role in driving traffic to your website. Use Facebook, Instagram, and LinkedIn to promote blog posts and other useful pages on your website. This way you can turn your social media audience into website visitors, and draw traffic from their networks if you post shareable content.
Like the hundreds of people before me, I thought this was an amazing post. You have a great way of breaking things down so that the average reader can understand them and act on them. I think this is a great resource for our readers, so I included it in my monthly roundup of the best SEO, social media, and content marketing articles. https://www.northcutt.com/blog/2014/02/january-resource-round-up-the-best-of-seo-social-media-and-content-marketing/
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
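To illustrate the point above, here is a minimal robots.txt of the kind the paragraph describes (the paths are hypothetical). Notice that the file itself advertises exactly where the supposedly hidden content lives:

```
User-agent: *
Disallow: /private/
Disallow: /internal-reports/
```

A well-behaved crawler will skip those directories, but nothing stops a browser, a rogue bot, or a curious user from requesting /private/ directly. For genuinely sensitive content, use server-side access control (authentication) rather than robots.txt.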
We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. Ninety days isn't long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience: the crew that does the persona research and focus groups prior to the wireframe stage.
Amazing article. In my view, the best source of traffic in today’s world is social networking sites. A huge number of people use social media, so we can connect with our audience easily. While doing research, I found this article: https://www.blurbpointmedia.com/design-social-media-business-marketing-strategy/ which is about developing a community on social media. I think the best way to run a successful social media account is to post different kinds of interesting content on a daily basis!
As a simple example, I recently renovated a Victorian-era house in the UK, and throughout the process, I was looking for various professionals who could demonstrate relevant experience. In this case, having a well-optimized case study showing renovation work on a similar house in the local area would serve as great long-tail SEO content. It also demonstrates that the contractor can do the job, which perfectly illustrates their credibility. Win-win.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
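As a sketch, the markup change described above amounts to adding a rel="nofollow" attribute to user-submitted links (the URL here is a placeholder):

```html
<!-- Without nofollow: the link can pass your site's reputation to the target -->
<a href="http://example.com/">commenter's site</a>

<!-- With nofollow: search engines are told not to credit the link -->
<a href="http://example.com/" rel="nofollow">commenter's site</a>
```

Most blogging platforms apply this attribute automatically when rendering comment markup, which is why the paragraph singles out the packages that don't.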
I read your post on my mobile phone while on a bus ride, and it stirred me because I’ve lately been doing SEO the poor man’s way: blog commenting, social bookmarking, forum signatures, directory submissions, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. I’m actually planning to make a new site right now after reading this one. I realized that maybe I’ve been doing so much spamming that my site is still not ranking for my desired keywords. You have also pointed out that Keyword Planner is not the only way to get keywords, since there are others like, as you said, Wikipedia and the like. I am planning to use this article as my guide in starting a new one. I bookmarked it… honestly… 🙂 And since I have read a lot of articles with SEO tips from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for many years. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to contact you through email, and I hope you can coach me, Brian… please… 🙂
Fantastic stuff, as usual, Brian. The First Link Priority Rule is always one that causes me great angst. I often get torn between search engines and usability when it comes to the main navigation bar. And I’ve never known what the heck to do about the “Home” link. You can hardly target your keywords with that one without it feeling awkward.
I’d add one thing to number 5: Writing good copy is crucial not just for your Title/snippet, but for your whole page, especially your landing page. You want people to stay on your page for a while and (hopefully) even navigate to other pages you have. Google looks at bounce rate and where they go after they hit your page. Learning to write good copy can not only increase conversion (if you’re selling something) but make your content more impactful and engaging. There are free books at most libraries or online to help.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Caffeine changed the way Google updated its index in order to make fresh content show up on Google more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..." Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.