It’s not enough to just share content through social channels – you need to actively participate in the community, too. Got a Twitter account? Then join in group discussions with relevant hashtags. Is your audience leaving comments on your Facebook posts? Answer questions and engage with your readers. Nothing turns people off quicker than using social media as a broadcast channel – use social media as it was intended and actually interact with your fans.
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
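For illustration, here is roughly what a BreadcrumbList structured data payload looks like (the page names and URLs are placeholders, not from any real site); the JSON it emits would be embedded in a `<script type="application/ld+json">` tag on the page:

```python
import json

# Hypothetical two-level hierarchy: "Books" -> "Science Fiction".
breadcrumb = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Books",
         "item": "https://example.com/books"},
        {"@type": "ListItem", "position": 2, "name": "Science Fiction",
         "item": "https://example.com/books/sciencefiction"},
    ],
}

# The serialized JSON-LD that goes inside the script tag.
print(json.dumps(breadcrumb, indent=2))
```

Positions start at 1 and run left to right, matching the visual order of the breadcrumb trail.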
In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis) and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up over the competition.

We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. Ninety days isn't long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience: the crew that does the persona research and focus groups prior to the wireframe stage.

LinkedIn has become much more than a means of finding another job. The world’s largest professional social network is now a valuable publishing platform in its own right, which means you should be posting content to LinkedIn on a regular basis. Doing so can boost traffic to your site, as well as increase your profile within your industry – especially if you have a moderate to large following.
Although this is a step-by-step series, everyone's methods will (and should) vary, so it really depends on how much time you think it will take (if you're billing hourly). What tools do you have at your disposal, and how much research will you have to do on your own? Will you have to pay for research reports or outside firms? Do you pay for a monthly data or research service?

I have been trying to produce more content because I believed the lack of traffic was due to the small amount of content, but after reading your blog post, I'm beginning to doubt whether or not this is quality content. I will definitely do more research on influencers in my niche; now I have to figure out how to get their attention with my kind of content.


Guest post on other blogs, and invite other bloggers to guest post on your site. Guest posts are a great way to drive traffic between related blogs, and allow you to gain readers that might not normally make it to your site. Be sure to allow any guest posters to link back to their own site, and share any guest posts the same as you would your own posts.
Awesome tips Brian. Always enjoy your posts. My question is, how can I boost traffic significantly if my keyword has pretty low search volume (around 100 monthly searches based on keyword planner)? I’ve been trying to expand my keyword list to include broader terms like “customer experience” but as you know that is super competitive. Do you have any suggestions for me? Thanks in advance.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
Expert roundups have been abused in the Internet Marketing industry, but they are effective for several reasons. First, you don’t have to create any content. The “experts” create all the content. Second, it is ego bait. Meaning, anyone who participated in the roundup will likely share it with their audience. Last, it is a great way to build relationships with influencers.
Who doesn’t love quizzes? They are enjoyable, shareable and hard to resist. Design a quiz for your website that somehow relates to your brand. If your company sells jeans, for instance, you could create a quiz called, “What are the best jeans for your body?” and in the results, show the brand’s recommended jeans. But remember—before showing results, be sure to capture the visitor’s e-mail address.

I am a little confused on your first point. Sorry if it is a simple one to understand and I’m just missing it. What good would finding dead links on Wiki do for my personal website? I thought you would explain how to find dead links faster within my own site… but it seems that your tip is way more valuable than that. I just don’t quite understand what I do to positively affect MY site with this. Any help would be great 🙂 THANKS!


Amazing article. In my opinion, the best source of traffic in today’s world is social networking sites. A huge number of people use social media, so we can connect with our audience easily. While doing research, I found this article: https://www.blurbpointmedia.com/design-social-media-business-marketing-strategy/ which is about developing a community on social media. I think the key to a successful social media account is posting different kinds of interesting content on a daily basis!
For example, if a swimming pool business is trying to rank for "fiberglass pools" -- which is receiving 110,000 searches per month -- this short-tail keyword can be the one that represents the overarching topic on which they want to create content. The business would then identify a series of long-tail keywords that relate to this short-tail keyword, have reasonable monthly search volume, and help to elaborate on the topic of fiberglass pools. We'll talk more about these long-tails in the next step of this process.
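As a rough illustration of that grouping step (all keywords and search volumes below are made up for the example), you might filter a keyword list down to long-tails that contain the topic term and still have meaningful volume:

```python
# Toy keyword data: monthly search volumes are invented, purely to
# show how long-tails cluster under the short-tail "fiberglass pools".
keywords = {
    "fiberglass pools": 110_000,                 # short-tail head term
    "fiberglass pools vs concrete": 1_900,
    "fiberglass pool cost": 4_400,
    "how long do fiberglass pools last": 880,
    "pools": 450_000,                            # too broad, different topic
}

topic = "fiberglass"
long_tails = sorted(
    (kw, vol) for kw, vol in keywords.items()
    if topic in kw and kw != "fiberglass pools"
)
for kw, vol in long_tails:
    print(f"{kw}: {vol}/mo")
```

The overly broad head term "pools" drops out because it doesn't contain the topic term, leaving three long-tails that can each anchor a supporting piece of content.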
Do not be fooled by those traffic sellers promising thousands of hits an hour. What they really do is load up your URL in a program, along with a list of proxies. Then they run the program for a few hours. It looks like someone is on your site because your logs show visitors from thousands of different IPs. What happens in reality is your website is just pinged by the proxy, no one really sees your site. It is a waste of money.
In the early days of the web, site owners could rank high in search engines by adding lots of search terms to web pages, whether they were relevant to the website or not. Search engines caught on and, over time, have refined their algorithms to favor high-quality content and sites. This means that SEO is now more complex than just adding the right words to your copy.
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[48] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[48] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[49] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
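The canonicalization idea can be sketched in a few lines. This is only an illustration of the principle (the normalization rules and example URLs are assumptions, not a production normalizer): collapse common variants of a URL onto one canonical form so links to any variant count toward the same page.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Map common URL variants (scheme, host case, trailing slash,
    tracking parameters) onto a single canonical form."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    # Drop the query string entirely here; a real site would keep
    # parameters that actually change the page content.
    return urlunsplit(("https", parts.hostname, path, "", ""))

variants = [
    "http://Example.com/widgets/",
    "https://example.com/widgets?utm_source=newsletter",
    "https://example.com/widgets",
]
# All three variants collapse to one canonical URL.
print({canonicalize(u) for u in variants})
```

In practice you would declare the canonical form with `<link rel="canonical">` or a 301 redirect rather than in application code; the function above just shows what "canonical" means.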
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
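As a rough sketch of the idea (the page table, paths, and messages here are all hypothetical), a handler can return the correct 404 status code while still serving a friendly page that links back to the root and to popular content:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical routing table for a tiny site.
PAGES = {"/": b"<h1>Home</h1>"}

# A helpful 404 body: point users at the root page and popular content.
NOT_FOUND = (b"<h1>Page not found</h1>"
             b'<p>Try the <a href="/">home page</a> or our '
             b'<a href="/popular">popular articles</a>.</p>')

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            # Keep the real 404 status so search engines don't index
            # the error page, but serve a useful body to humans.
            self.send_response(404)
            body = NOT_FOUND
        else:
            self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)
```

The important detail is sending the 404 status with the friendly page; returning 200 for missing URLs (a "soft 404") can get error pages indexed.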
We expect advertisements to be visible. However, you should not let advertisements distract users or prevent them from consuming the site content. For example, avoid advertisements, supplemental content, or interstitial pages (pages displayed before or after the content the user expects) that make it difficult to use the website. Learn more about this topic.
Search engines attempt to rank results for a given search based on their relevance to the topic, and the quality and reliability a site is judged to have. Google, the world’s most popular search engine, uses an ever-evolving algorithm that aims to evaluate sites in the way that a human reader would. This means that a key part of SEO involves ensuring that the website is a unique and relevant resource for readers.
Firstly, a disclaimer – don’t spam Reddit and other similar sites hoping to “hit the jackpot” of referral traffic, because it’s not going to happen. Members of communities like Reddit are extraordinarily savvy to spam disguised as legitimate links, but every now and again, it doesn’t hurt to submit links that these audiences will find genuinely useful. Choose a relevant subreddit, submit your content, then watch the traffic pour in.

He started by finding an offer that resonated with and is relevant to his audience. In his case, his blog was dedicated to teaching people how to use a software called “Sublime Text.” He simply offered a license to the software for the giveaway. By doing this, not only did he increase the chances of success of his giveaway since his incentive was relevant, but he also ensured the quality of subscribers since they were actually people interested in his content. It’s easy to give people an iPad or an iPhone, but how relevant will they be to you at the end of the day?
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
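Python's standard-library robots.txt parser offers a quick way to sanity-check whether your asset paths are blocked. The rules below are a hypothetical misconfiguration (an `/assets/` directory holding the site's CSS), not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that mistakenly blocks the directory
# containing the site's CSS and JavaScript.
bad_rules = """\
User-agent: *
Disallow: /assets/
"""

rp = RobotFileParser()
rp.parse(bad_rules.splitlines())

# Googlebot would be refused the stylesheet it needs to render the page,
# while the article itself remains crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/assets/site.css"))
print(rp.can_fetch("Googlebot", "https://example.com/article.html"))
```

Running a check like this against your own robots.txt is a cheap way to catch accidental `Disallow` rules before they affect rendering.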
It’s not enough to produce great content and hope that people find it – you have to be proactive. One of the best ways to increase traffic to your website is to use social media channels to promote your content. Twitter is ideal for short, snappy (and tempting) links, whereas Google+ promotion can help your site show up in personalized search results and seems especially effective in B2B niches. If you’re a B2C product company, you might find great traction with image-heavy social sites like Pinterest and Instagram. Here's more advice on making the most of social media marketing.
Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.

Facebook is keen to promote streaming video – the success of Twitch.tv has them drooling. This means that streaming videos are given massive “reach” – more people see them. They’ll show your video to more of your page subscribers, more group members, etc. If the video is good, you’ll get lots of shares and likes, and this can build your audience rapidly.
Text-based content is all well and good, but video can be a valuable asset in both attracting new visitors and making your site more engaging. Data shows that information retention is significantly higher for visual material than it is for text, meaning that video marketing is an excellent way to grab – and hold – your audience’s attention, and boost traffic to your website at the same time.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Hack #1: Hook readers in from the beginning. People have low attention spans. If you don’t have a compelling “hook” at the beginning of your blogs, people will click off in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for example!), telling a story, or stating a common problem that your audience faces.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. Picture each website as a node in a graph: programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B) while site E does not.
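The "links carry through" behavior can be sketched with a toy power-iteration ranking. This is an illustrative simplification (the node names, link graph, and damping factor are assumptions, not Google's actual algorithm): B collects many inbound links, and C's single inbound link comes from the popular B.

```python
# Toy link graph: A, D, E, F all link to B; B links to C; C links nowhere.
links = {
    "A": ["B"], "D": ["B"], "E": ["B"], "F": ["B"],
    "B": ["C"], "C": [],
}
nodes = list(links)
rank = {n: 1 / len(nodes) for n in nodes}
d = 0.85  # damping factor, the conventional choice

for _ in range(50):
    # Dangling nodes (no outlinks) spread their rank evenly.
    dangle = sum(rank[n] for n in nodes if not links[n])
    rank = {
        n: (1 - d) / len(nodes)
           + d * (sum(rank[m] / len(links[m]) for m in nodes if n in links[m])
                  + dangle / len(nodes))
        for n in nodes
    }

print(sorted(rank, key=rank.get, reverse=True))
```

After iteration, B and C end up far ahead of the leaf sites: C outranks E even though both have at most one inbound link, because C's link comes from the heavily linked B.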
Hey Mischelle, thanks for the input! It’s true, SEO is definitely a long game. You need to lay the foundation and keep improving your site, publish new content and promote what you already have. However, if you keep at it, it can pay off nicely over time. And you are right, picking the right keywords is one of the foundations for SEO success. Thanks for commenting!
Traditionally, defining a target audience involves determining their age, sex, geographic locations, and especially their needs (aka pain points). Check out usability.gov’s description of personas and how to do task analysis & scenarios for more details, or better yet, read Vanessa Fox’s upcoming book about personas related to search and conversion.

Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."
