A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[48] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[48] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[49] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
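As a rough sketch of the redirect side of this (the framework, routes, and URLs below are hypothetical, not from the original text), a small Flask app might consolidate a duplicate URL onto its canonical version like so:

```python
# A minimal sketch (hypothetical Flask app and URLs): consolidate an old,
# duplicate URL onto the canonical one with a permanent (301) redirect so
# that links pointing at either address count for the same page.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/index.php")   # legacy duplicate of the homepage
def legacy_home():
    return redirect("/", code=301)

@app.route("/")
def home():
    # The canonical page; its <head> could also declare
    # <link rel="canonical" href="https://www.example.com/"> as a hint.
    return "<h1>Welcome</h1>"
```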


Content gaps – take an inventory of the site’s key content assets: is the site lacking any foundational/cornerstone content pieces, missing content types, or relevant topic areas that haven’t been covered? What topics or content are your competitors missing? Can you beat your competitors’ information-rich content assets? Useful guides on Content Gap Analysis:
Getting traffic is always important, but one should not worry too much; nothing happens overnight. I read this article and genuinely tried to form my own impression of the post, which automatically creates a link to my blog. But don’t try too hard with backlinks in mind, or you will get caught one way or another; Panda and Penguin are examples of that.
Not sure exactly why; perhaps I used too big a number, and since my page is about classifieds, it probably seemed like too much to browse through 1500 ads, I assume? Somewhat like posting 800 tips for better ranking? I don’t know; I will try to change things a bit and see how it goes, but you really gave me some new suggestions to go for with this article. Thanks again 🙂
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized but stop short of producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Hi there, I’m interested in trying your Wikipedia trick, but I’m also not sure how I should do it, because I read some posts saying that “Please note that Wikipedia hates spam, so don’t spam them; if you do, they can block your IP and/or website URL. Check their blocking policy, and if they blacklist you, you can be sure that Google may know about it.”
I’m considering a niche that I’m not sure I can find good influencers for – fundraising. School fundraising or charitable fundraising. I’m passionate about it but how would I get my articles shared by influencers? The non-profit sector is somewhat apprehensive about promoting commercial sites, unless it’s fundraising software. The name really says it all: “non”-profit.
Hey Brian, this article is really, really awesome. Seriously, you have covered all 21 points, which I have never read anywhere else on the internet. Everyone shares the basics, but here you have shared awesome info, especially the Facebook keyword research and the 1000+ words per post, and the Wikipedia ideas are really good and helpful. I learned many things from this article. Keep sharing this kind of info. Thanks!
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
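For illustration only, a robots.txt file for a hypothetical blog subdomain might look like the following; the paths are placeholders, and each subdomain needs its own copy served at its root:

```
# robots.txt served at https://blog.example.com/robots.txt (hypothetical)
# Rules apply only to this subdomain; www.example.com needs its own file.
User-agent: *
Disallow: /drafts/     # keep unfinished posts out of crawling
Disallow: /search      # internal search result pages
Allow: /
```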
If you create memorable content, people will want to come back for more. So instead of churning out lackluster content that can be found anywhere on the web, write higher-quality, unique content that caters directly to your audience. Speak your opinion on a subject matter, instead of just objectively providing facts. Create useful, thought-provoking content. Posting three so-so blog posts a week will not be nearly as effective as posting one superb blog post per week.
In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis) and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up over the competition.
Expert roundups have been abused in the Internet Marketing industry, but they are effective for several reasons. First, you don’t have to create any content; the “experts” create all of it. Second, it is ego bait, meaning anyone who participated in the roundup will likely share it with their audience. Last, it is a great way to build relationships with influencers.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it has been almost a year and I have found that Google’s robot still often crawls these pages. How can I quickly get Google to completely remove these pages? I have also removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls them.
Also make sure that your blog posts are consistent with one another and that each post has the same-sized images, headings and font. Always ensure that your blog post titles don’t lead your visitors astray.  This may seem obvious, but it happens more often than you’d think. For example, if your blog post is titled “The Top 10 Places to Hike in Southern California” but the post itself talks about hiking spots all throughout the entire state of California, you’re probably going to lose visitors. After all, it’s not what they had signed on for!
If your social media profiles contain a link to your website, then you’ve turned your engagement into another channel for website traffic. Just be sure to engage moderately and in a sincere way, and avoid including links to your website in your comments—lest you appear spammy and hurt your online and business reputation. Increased traffic should not be the goal of your engagement, but rather a secondary result.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B) while site E does not, so C ranks higher than E. Note: Percentages are rounded.
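To make the "links carry through" idea concrete, here is a toy PageRank-style calculation in Python. It is not Google's actual algorithm, and the link graph is an assumption modeled loosely on the caption: several sites link to B, B and C link to each other, and E's only inbound link comes from a low-authority site.

```python
# A toy PageRank-style sketch (not Google's actual algorithm) showing that
# links from popular pages pass more weight. The link graph is hypothetical.
links = {
    "A": ["B"],
    "D": ["B"],
    "F": ["B", "E"],
    "B": ["C"],
    "C": ["B"],
    "E": [],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in links:
        # Sum the share of rank passed on by every page that links here.
        incoming = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))
# B ranks highest; C, whose only inbound link comes from B, still outranks E.
```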

Incredible post and just what I needed! I’m actually kind of new to blogging (my first year coming around), and so far my expertise has been in copywriting/SEO copywriting. However, link building has become tedious for me. Your talk about influencing influencers makes perfect sense, but I find it difficult for my niche. My blog is built around “gift ideas” and holiday shoppers, complete with social networks. I get shares and such from my target audience, but I find that my “influencers” (i.e. Etsy, Redbox, Vat19, etc.) don’t allow dofollow links, and I usually can’t find suitable sources. I guess my trouble is just prospecting in general.

As a simple example, I recently renovated a Victorian-era house in the UK, and throughout the process, I was looking for various professionals who could demonstrate relevant experience. In this case, having a well-optimized case study showing renovation work on a similar house in the local area would serve as great long-tail SEO content; it also demonstrates that the contractor can do the job, which perfectly illustrates their credibility. Win-win.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
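As a sketch of what that automation could look like (the page text, length limit, and helper name are illustrative assumptions, not a documented method), you might derive each description from the opening of the page's own body text:

```python
# A rough sketch (illustrative data and limits) of auto-generating a
# description meta tag from a page's own body text.
import html

MAX_LEN = 155  # roughly the length search snippets tend to display

def make_meta_description(body_text: str, max_len: int = MAX_LEN) -> str:
    # Collapse whitespace and keep the opening of the page's content.
    summary = " ".join(body_text.split())
    if len(summary) > max_len:
        summary = summary[:max_len].rsplit(" ", 1)[0].rstrip(",.;:") + "…"
    return f'<meta name="description" content="{html.escape(summary, quote=True)}">'

page_body = (
    "Our Victorian renovation guide walks through budgeting, planning "
    "permission, and choosing contractors for period properties in the UK."
)
print(make_meta_description(page_body))
```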
Regarding internal linking, I believe that when two links point to the same internal page, and one of them belongs to the group I mentioned above, only the link that feeds the algorithm more information will be counted. On sites that have the menu before the content, that will be the second link. I think that’s the smart way for them to analyse all the links so they better understand the destination page’s content. And they are smart 😉.

On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a little bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!
Guest blogging is a two-way street. In addition to posting content to other blogs, invite people in your niche to blog on your own site. They’re likely to share and link to their guest article, which could bring new readers to your site. Just be sure that you only post high-quality, original content without spammy links, because Google is cracking way down on low-quality guest blogging.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
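One way to sanity-check this is to test your robots.txt rules against the asset URLs a page actually loads. A minimal sketch using Python's standard library (the domain and asset paths are placeholders):

```python
# A minimal sketch (placeholder URLs) that checks whether a site's
# robots.txt lets Googlebot fetch the JavaScript/CSS/image assets a page
# depends on, using only the Python standard library.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # downloads and parses the live robots.txt

assets = [
    "https://www.example.com/static/app.js",
    "https://www.example.com/static/site.css",
    "https://www.example.com/images/hero.jpg",
]

for url in assets:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```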
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine was a change to the way Google updated its index, intended to make new content show up in results more quickly than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
Getting more website visitors does not happen overnight. It takes some effort but we’ve eliminated the hard part for you: knowing what to do in the first place. By using Google My Business and the other safe channels listed above, you can get the right visitors coming to your site and more importantly, more of those visitors converting into customers.
This post and the Skyscraper Technique changed my mind about how I approach SEO. I’m not a marketing expert and I haven’t ranked sites that monetize really well; I’m just a guy trying to get some projects moving, and I’m not even in the marketing business. So I just wanted to say that the way you write makes the information accessible, even for a non-native English speaker like myself.

Firstly, really think about what your audience is interested in and what their needs are. As SUCCESS agency CEO, Avin Kline, states, “It’s so easy to forget, but the heart of increasing user engagement is to put yourself in their shoes and add undeniable value to the user. Keep in mind, what marketers think is valuable and what users think is valuable are often two different things.”
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
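As a sketch of the idea (hypothetical Flask app, URLs, and copy), a custom 404 handler that points visitors back to working pages might look like this:

```python
# A sketch (hypothetical Flask app and links) of a custom 404 page that
# guides lost visitors back to working content instead of a dead end.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    body = """
    <h1>Sorry, we can't find that page.</h1>
    <p>The link may be broken or the address mistyped.</p>
    <ul>
      <li><a href="/">Back to the homepage</a></li>
      <li><a href="/blog/">Popular articles</a></li>
      <li><a href="/contact/">Contact us</a></li>
    </ul>
    """
    # Return a real 404 status code so search engines don't treat the
    # error page as ordinary content (a "soft 404").
    return body, 404

@app.route("/")
def home():
    return "<h1>Home</h1>"
```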
Hi Brian, awesome content as ever! I’m very interested in your idea of creating an ‘uber’ resource list or expert roundup post, i.e. linking out to lots of other authorities in my niche within one post. But should you always create ‘no-follow’ links to these authority sites to prevent juice from passing to them? And similarly, if you sprinkle a few outbound authority links in other posts, should they all be ‘no-follow’, or do you think big G ignores ‘no-follow’ these days?
For example, we regularly create content on the topic of "SEO," but it's still very difficult to rank well on Google for such a popular topic on this acronym alone. We also risk competing with our own content by creating multiple pages that are all targeting the exact same keyword -- and potentially the same search engine results page (SERP). Therefore, we also create content on conducting keyword research, optimizing images for search engines, creating an SEO strategy (which you're reading right now), and other subtopics within SEO.
A quick search for “SEO ranking factors” will give you all of these answers and myriad others. There is a lot of information out there. And the reality is, while there are likely hundreds of variables working together to determine final placement, much of what is suggested is guesswork. And certainly, not all ranking factors are relevant to every business.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]