Current search engine optimization focuses on techniques such as making sure that each web page has appropriate title tags and that the content is not "thin" or low-quality. High-quality content is original, authoritative, factual, grammatically correct, and engaging to users. Poorly edited articles with spelling and grammatical errors will be demoted by search engines. For more information on thin content, see Google's "More Guidance on Building High-quality Sites."
Do not be fooled by traffic sellers promising thousands of hits an hour. What they really do is load your URL into a program along with a list of proxies, then run the program for a few hours. It looks like people are on your site because your logs show visitors from thousands of different IPs. In reality, your website is just being pinged by the proxies; no one actually sees your site. It is a waste of money.
I definitely learned tons of new things from your post. This post is old, but I didn't get the chance to read all of it earlier. I'm totally amazed that these things actually exist in the SEO field. What I liked most were the dead-links scenario on Wikipedia, the Flippa thing, the Reddit keyword research, and lastly the Facebook ad keyword research. It's like Facebook is being trolled into providing us keywords while thinking it's promoting ads.
Nothing looks sloppier than websites that don’t abide by any sort of style guide. Is your blog section a complete deviation from your website? If so, this very well could throw off your visitors and decrease engagement. Instead, make sure that all of your web pages are consistent in design, font and even voice. For instance, if you use a very formal tone on your homepage, but a super casual tone in your blog posts, this could highlight brand inconsistency.
I consulted a few years ago before Yahoo and CNET, and my clients were all small businesses, even friends' sites.  No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want.  I mentioned in a previous comment that I once used search to determine sentiment on a site vs. its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc.  I also took note of who the people were who said those things and where they were talking (forums, Twitter, etc.).  It's a hacked manual approach, and although nowhere near the quality of a good market research report, at least I have a little bit of insight before going out to make site recommendations based solely on tags & links.  If you're recommending the site build things that people want (and fix or remove things that they don't), you're more likely to gain links and traffic naturally.
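The manual approach above can at least be made repeatable. Here is a minimal sketch that generates the search queries described (brand + feature + sentiment word) so you can paste them into a search engine one by one; the brand names and feature are hypothetical placeholders, not from the comment:

```python
# Sentiment words the commenter suggests pairing with a feature search.
SENTIMENT_WORDS = ["like", "love", "hate", "wish"]

def sentiment_queries(feature, brands):
    """Build one quoted search query per (brand, sentiment word) pair."""
    return [
        f'"{brand}" "{feature}" "{word}"'
        for brand in brands
        for word in SENTIMENT_WORDS
    ]

# Hypothetical example: compare chatter about a shared feature
# across your site and a competitor.
queries = sentiment_queries("photo editor", ["AcmeApp", "RivalApp"])
print(len(queries))   # 2 brands x 4 words = 8 queries
print(queries[0])     # "AcmeApp" "photo editor" "like"
```

Running each query and noting who is talking and where (forums, Twitter, etc.) gives the same rough sentiment picture, just with less retyping.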

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search," where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to the changes made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.


Start browsing through articles in the same category as your content. Like the articles you genuinely enjoy, and downvote the ones you're not interested in. Do this for a few minutes every day. This step is very important: StumbleUpon uses the data to learn what kind of content you like. When you submit content, StumbleUpon will show it to other users who like the same kind of content. Act like your ideal reader, and that's who StumbleUpon will share your content with.
The strength of your link profile isn’t solely determined by how many sites link back to you – it can also be affected by your internal linking structure. When creating and publishing content, be sure to keep an eye out for opportunities for internal links. This not only helps with SEO, but also results in a better, more useful experience for the user – the cornerstone of increasing traffic to your website.
I can feel the excitement in your writing, and thanks for all this free info. You know how to get loyal subscribers; I believe you are one of the best in the business. No upselling, just honesty, and it's so refreshing. I can't keep up with you! I have only just finished the awesome piece of content you told me to write, and I'm about to modify it and then finally start promoting. I will be looking at this too. THANK YOU. PS: I couldn't make your last course, but I will get on board for the next one.

You mentioned: "many times clients have already done this work.  Ask them for copies of their market research reports when you start a project.  It will save you a ton of time and effort!"  We do this with most of our clients, and like you said, we have found that around 75% of them have some kind of market research done. That saves you a lot of time and helps set up the right SEO strategy.
People love reading about results. That’s because it’s one of the best ways to learn. You can read information all day, but results show you the practical application of the information. Create content showing real life results. It’s easy in my industry because results are all that matter. But this can work in other industries as well. Here are some non-marketing examples:
Hello Brian, this is really such an informative article, and it's even more meaningful because you provided screenshots. I have noticed that articles with images bring more value to understanding things. I have just started my career in this industry and keep looking for good articles/blogs that are meaningful and help me implement tips in my work apart from my seniors' instructions. I guess this way I can prove my caliber to them 🙂

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized, but do not focus on producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.


Yesterday I was redoing our process for ideas, and Alltop was a part of it. I have known it was a bit spammy (some of my grey sites are featured), but now it seems way too bad. You have places like the New York Times next to random AdSense blog X. Guy Kawasaki really needs to start giving some sort of influence ranking, or at least culling the total crap ones.
Organic search engine rankings place emphasis on variable factors such as design and layout, keyword density, and the number of relevant sites linking to a page. Search engines constantly update and refine their ranking algorithms in order to index the most relevant sites. Other variables that have an impact on search engine placement include the following:
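Keyword density, one of the factors named above, is simply the fraction of a page's words that match a target keyword. A minimal sketch of the calculation, for a single-word keyword and with a simplistic word tokenizer (both assumptions on my part, since search engines do not publish their exact method):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

density = keyword_density("SEO tips: good SEO takes time", "seo")
print(round(density, 2))  # 2 of 6 words -> 0.33
```

Note that stuffing a page to inflate this ratio is exactly the kind of manipulation modern algorithms penalize; the metric is useful mainly as a sanity check that a topic is actually mentioned.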
Hi there, I'm interested in trying your Wikipedia trick, but I'm not sure how I should do it, because I read some posts saying that "Please note that Wikipedia hates spam, so don't spam them; if you do, they can block your IP and/or website URL. Check their blocking policy, and if they blacklist you, you can be sure that Google may know about it."

Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."