Btw, I was always under the impression that Digg and Delicious were dying, but it turns out I was mistaken. Your (and Jason’s) thinking is foolproof, though. If these guys are already curating content, there’s no reason they wouldn’t want to do more of just that! SEO has become a lot of chasing and pestering… it’s good of you to remind us that there are people out there just waiting to share stuff, too. :)

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by doing so; Google's new system punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words.[39] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content and treat its creators as 'trusted' authors.
The first relates to internal link structure. I’ve made the mistake you say you’ve seen so often. I have a primary keyword and have used that keyword in the main navigation, linked to a page optimized for that keyword. But I’ve also got a bunch of contextual links in posts pointing to that page, usually with the keyword in the anchor text. I now understand that those internal links aren’t helping much, at least from an SEO perspective. Am I better off removing that keyword and direct link from the menu and simply linking the page from multiple posts and pages within the site? Or will I get better results leaving it in the main menu and changing the contextual links in the posts to point to a related page with a different keyword?

This is truly amazing and I’m gonna share it with like-minded people. I loved the part about Flippa. What a great source to get ideas. Building links tends to be the hardest part, but a few good-quality links is all you need nowadays to get ranked. I currently rank for a very high-volume keyword with only 5 links, all with PR 3 or 4 and good DA and PA. Good links are hard to get, but you only need a few, which is encouraging! Props for this post!
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, the average time on page is something like 0.02 seconds, compared to more than 2 minutes for other traffic sources on my website. I have heard a few guys describe a similar experience with Quuu, so I thought I should let you know.
Google has recently changed how you can use the Google Keyword Planner. Before, everyone who signed up could see exact search volumes for keywords. Now, it only shows broad estimates. There is a way around this: create a Google AdWords campaign. The amount you spend doesn’t matter. Once the campaign is running, you regain access to exact search volumes.
Wow, I wish I had the comments you do. So you’re saying that revisiting and rewriting old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
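The spider/indexer split described above can be sketched with Python's standard library: one pass downloads a page (mocked here as a static string), and an indexer extracts the links it contains plus word locations and counts. Class and field names are purely illustrative, not any real search engine's implementation:

```python
from html.parser import HTMLParser
from collections import Counter

class PageIndexer(HTMLParser):
    """Toy indexer: collects outgoing links (to feed a crawl
    scheduler) and word frequencies from a downloaded page."""
    def __init__(self):
        super().__init__()
        self.links = []       # hrefs queued for crawling later
        self.words = Counter()  # word -> occurrence count

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Index only alphabetic tokens, case-folded.
        self.words.update(w.lower() for w in data.split() if w.isalpha())

# A "downloaded" page, stored locally instead of fetched over HTTP.
page_html = '<p>SEO basics</p><a href="/guide">guide</a><a href="/tips">tips</a>'
indexer = PageIndexer()
indexer.feed(page_html)
print(indexer.links)        # links extracted for the scheduler
print(indexer.words)        # term counts for the index
```

A real spider would fetch `page_html` over the network and push `indexer.links` into a crawl queue; the word positions and weights mentioned above would need a richer data structure than a plain `Counter`.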
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms can render and index your content, which can result in suboptimal rankings.
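As a concrete illustration, a robots.txt along these lines blocks a private section while keeping page resources crawlable; the paths are hypothetical, so adapt them to your own site layout:

```
# Hypothetical robots.txt: restrict one area, but leave the
# JavaScript, CSS, and images that pages depend on crawlable.
User-agent: Googlebot
Disallow: /private/
Allow: /*.js$
Allow: /*.css$

# Avoid blanket rules like "Disallow: /assets/" that hide the
# files Googlebot needs to render your pages.
```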
Search engine optimization, or SEO, has become a huge priority for marketers over the last few years. It’s easy to see why: higher search engine rankings result in more traffic, more leads, and higher sales and conversions. But how, exactly, does it work? How does adding keywords to various site elements improve your chances of ranking well in search engines?

Great article as always. My wife is about to start a business teaching (mainly) mums how to film and edit little movies of their loved ones for posterity (www.lovethelittlethings.com launching soon). We have always struggled with thinking of and targeting relevant keywords, because keywords like ‘videography’ and ‘family movies’ don’t really sum up what she is about. Your article ties in with other lessons we have come across: we obviously need to reach out to the right people and get them to share to get her product out there, because I don’t think purely focusing on keywords will get us anywhere.

The goal of SEO is to earn a web page a high search engine ranking. The better a web page's search engine optimization, the higher a ranking it will achieve in search result listings. (Note that SEO is not the only factor that determines search engine page ranks.) This is critical because most people who use search engines only look at the first page or two of results, so for a page to get high traffic from a search engine, it has to be listed on those first two pages; the higher the rank, and the closer a page is to the number one listing, the better. And whatever your web page's rank, if your business sells products or services over the internet, you want your website listed before your competitors' websites.

Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.