SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]
For my Adsense plugin, which you can get here https://wordpress.org/plugins/adsense-made-easy-best-simple-ad-inserter/, I’ve created a PRO version (https://www.seo101.net/adsense-made-easy-pro/) that is available to those who sign up for my mailing list. It’s not much, but it gets me 5 to 6 subscribers a day. And best of all, I know exactly what my subscribers are interested in… WordPress and Adsense :)
Wow, I wish I had the comments you do. So you’re saying that revisiting and rewriting old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
Getting free website traffic may not cost you monetarily, but it will require effort on your part. The effort you put in, however, will determine the quality of the traffic you generate. As mentioned above, there is no point in getting more traffic to your website if those visitors are not likely to engage with your pages, convert into leads, or become customers.

Great post, your knowledge and innovative approach never fail to amaze me! This is certainly the first time I’ve heard someone suggest the Wikipedia dead-link technique. It’s great that you’re getting people to think outside of the box. Sites like Reddit are great for finding keywords and can also be used for link building, although this can be difficult to get right. Even if you don’t succeed at using it for link building, it’s still a really valuable platform for gathering useful information. Thanks!
There are a number of ways to optimize your website for conversion, such as by including calls to action and lead capture forms in the right places, providing the information your visitors are seeking, and making navigation easy and intuitive. But the first step is to attract the right visitors to your site in the first place. Your goal when it comes to website traffic is to drive more qualified visitors to your site: those who are most likely to convert into leads and customers.
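For illustration only, here is a minimal sketch of what a call to action with a lead capture form can look like; the form action, field names, and wording are placeholders rather than a recommendation:

    <form action="/newsletter-signup" method="post">
      <label for="email">Get the weekly guide to growing your traffic</label>
      <input type="email" id="email" name="email" placeholder="you@example.com" required>
      <button type="submit">Send me the guide</button>
    </form>

Placed near the content a qualified visitor came for, a block like this makes the next step obvious without getting in the way of navigation.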
In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis), and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up on the competition.

Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
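To make that concrete, a robots.txt rule is nothing more than a request addressed to compliant crawlers (the directory name below is hypothetical):

    User-agent: *
    Disallow: /private-reports/

Anyone can fetch /robots.txt and read that path, and a rogue crawler can simply ignore it, so content that truly must stay private should sit behind authentication, while pages you merely want kept out of search results are better handled with a noindex meta tag or an X-Robots-Tag response header.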
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
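As a hypothetical example, the only difference between the two cases is the rel attribute on the link you don't want to vouch for:

    <!-- A normal link that passes along some of your site's reputation -->
    <a href="https://example.com/useful-guide/">a genuinely useful guide</a>

    <!-- The site you're calling out, marked so it receives none of that reputation -->
    <a href="https://example.com/spammy-site/" rel="nofollow">the site that spammed my comments</a>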
SEO often involves improving the quality of the content, ensuring that it is rich in relevant keywords and organizing it by using subheads, bullet points, and bold and italic characters. SEO also ensures that the site’s HTML is optimized such that a search engine can determine what is on the page and display it as a search result in relevant searches. These standards involve the use of metadata, including the title tag and meta description. Cross linking within the website is also important.
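As a rough sketch (the wording is invented purely for illustration), those on-page elements look like this in the HTML:

    <head>
      <title>Descriptive, keyword-relevant page title</title>
      <meta name="description" content="A one- or two-sentence summary that search engines may show as the snippet.">
    </head>
    <body>
      <h1>Main heading stating the topic of the page</h1>
      <p>Opening paragraph that uses the relevant keywords naturally.</p>
      <h2>Subhead that breaks the content into scannable sections</h2>
      <ul>
        <li>Bullet point with a <strong>bold</strong> or <em>italic</em> highlight</li>
      </ul>
      <p>Related reading: <a href="/another-article/">an internal cross link</a> to another page on the site.</p>
    </body>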
Product images. If you think images don't play a role, think again. When many consumers search for products in the search engines, not only are they looking at the "Web" results, but they're also looking at the "images" results. If you have quality images of that product on your site -- and the files' names contain relevant keywords -- these images will rank well in search engines. This avenue will drive a lot of traffic to your site, as potential customers will click on that image to find your store.
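For example (the product and file name are invented for illustration), a keyword-relevant file name and descriptive alt text give image search something to work with:

    <img src="/images/red-leather-messenger-bag.jpg"
         alt="Red leather messenger bag with brass buckles"
         width="800" height="600">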
Backlinks. If content is king, then backlinks are queen. Remember, it's not about which site has the most links, but who has the most quality links pointing back to their website. Build backlinks by submitting monthly or bi-monthly press releases about any exciting company news, and by contacting popular blogs in your niche to see how you can work together to get a backlink from their website. Create the best possible product site you can, so people talking about the products you sell will link back. Try creating graphics or newsworthy content that will influence bloggers and news websites to link to that content.
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
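A bare-bones sketch of such a page might look like this (the links are placeholders):

    <!-- 404.html -->
    <h1>Sorry, we can't find that page.</h1>
    <p>It may have moved, or the address may be mistyped.</p>
    <p>Try the <a href="/">home page</a>, or one of our popular pages:</p>
    <ul>
      <li><a href="/blog/">The blog</a></li>
      <li><a href="/products/">Products</a></li>
    </ul>

How you serve it depends on your server; on Apache, for instance, an ErrorDocument 404 /404.html directive does the job. Make sure missing URLs still return a 404 status code so search engines don't index the error page itself.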
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
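For instance, a platform that nofollows commenter links would typically render them something like this (the URL is a placeholder; rel="ugc" is a newer, optional hint for user-generated content):

    <!-- A link left by a commenter, automatically marked so it passes no reputation -->
    <a href="https://example.com/commenters-site/" rel="nofollow ugc">commenter's site</a>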

Regarding internal linking, I believe that in the case of two links pointing to an internal page, with one of those links being in the group I mentioned above, they will consider only the one which feeds the algorithm with more information. On sites that have the menu before the content, it will be the second link. I think that’s the smart way for them to analyse all the links to better understand the destination page’s content. And they are smart 😉 .
Thanks Jure. That actually makes sense. Exactly: I’ve tested lowering the number of tips in a few posts and it’s helped CTR/organic traffic. One thing to keep in mind is that the number can also be: the year, time (like how long it will take to find what someone needs), % (like 25% off) etc. It doesn’t have to be the number of tips, classified ads, etc.
I’m considering a niche that I’m not sure I can find good influencers for – fundraising. School fundraising or charitable fundraising. I’m passionate about it but how would I get my articles shared by influencers? The non-profit sector is somewhat apprehensive about promoting commercial sites, unless it’s fundraising software. The name really says it all: “non”-profit.
Great article, learned a lot from it! But I still don’t really get the part about share triggers and the right content. For instance, influencers now care a lot about the new Koenigsegg Agera RS >> https://koenigsegg.com/blog/ (Car). I thought about an article like “10 things you need to know about the Koenigsegg Agera RS”. The only problem is that I don’t know which keywords I should use and how I can put in share triggers.
To gain more customer engagement, your website must reach its visitors efficiently. Obviously, you want visitors to read your content, fill out your forms, and click through on your calls to action (CTAs) when they arrive on your page. These features put user engagement into action, but it is just as essential to analyze that behavior in depth.
If you're looking to upload an image to a blog post, for example, examine the file for its file size first. If it's anywhere in megabyte (MB) territory, even just 1 MB, it's a good idea to use an image compression tool to reduce the file size before uploading it to your blog. Sites like TinyPNG make it easy to compress images in bulk, while Google's very own Squoosh has been known to shrink image file sizes to microscopic levels.