10 Revealed Secrets To Get More Traffic For Our Blogs or Website



There are plenty of handbooks on increasing traffic to a website; however, most of them contain points I disagree with, or considerations that go unnoticed or are skipped altogether. With this article, I want to share 10 tips for improving our blogs (always from my point of view). Most blogs with decent visitor traffic (sometimes not even that — merely being indexed in Google is enough) will show a graph similar to the following:


[Screenshot: a typical visitor-traffic graph]

Publish entries often

The more articles you publish, the more content your blog has, and the broader the range of searches that can bring in views. But be careful! Not just any content will do: try to improve the blog by creating quality content (don't duplicate entries, don't copy articles from elsewhere; write your own). Every time you update, you can send pings to the many ping services for blogs, such as Pingoat or Ping-O-Matic. Use them only when you actually update! These pings used to be the recommended way to announce fresh, updated content. Today, simply signing up for Google Webmaster Tools lets Google keep track of our updates.
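As an illustration, services like Ping-O-Matic accept the standard weblogUpdates.ping call over XML-RPC. Here is a minimal sketch using Python's standard library; the endpoint URL and the blog details are placeholder assumptions, not values from this article:

```python
import xmlrpc.client

# Assumed Ping-O-Matic XML-RPC endpoint — check the service's docs before use.
PING_ENDPOINT = "http://rpc.pingomatic.com/"

def build_ping_request(blog_name: str, blog_url: str) -> str:
    """Build the XML-RPC request body for a standard weblogUpdates.ping call."""
    return xmlrpc.client.dumps((blog_name, blog_url), methodname="weblogUpdates.ping")

def send_ping(blog_name: str, blog_url: str):
    """Notify the ping service that the blog has fresh content."""
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    return server.weblogUpdates.ping(blog_name, blog_url)

if __name__ == "__main__":
    # Hypothetical blog details — replace with your own before pinging.
    print(build_ping_request("My Blog", "https://example.com/blog"))
```

Remember the advice above: only send the ping when you have actually published something new.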

Optimize robots.txt file

The robots.txt file lives in the root of any well-set-up blog. Many search engine robots visit this file to see which pages of the site they are allowed to index and which they are not. In some cases, simply optimising the robots.txt file has reportedly increased traffic by more than 1400%. You can find an article explaining in detail what the robots.txt file is and how it is used in Robots.txt: Everything you should know.
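As a sketch of what such a file might look like — the paths here are hypothetical examples, not recommendations for your particular site:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```

Each Disallow line tells compliant robots to skip that path; the optional Sitemap line points them to your site map (covered later in this article).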

Keep track of your statistics

If there is one section every blogger tends to keep an eye on, it is this one. Have at least one statistics service to monitor how many users reach your blog, through which keywords (useful for knowing whether you are using good post titles and whether they match your blog's content), which browsers they use, and so on. My favourites (and free): Google Analytics and StatCounter.
Also, if you use Google Analytics, you can apply a little trick to track the number of clicks on AdSense banners by day, by week, or even per page.

Positioning / SEO

Perhaps one of the most complex sections, and the one most people neglect, yet without doubt the most important. Google (and other search engines) apply a number of "rules" to rank pages in order of importance; the better you follow them, the better your results when positioning yourself in Google.

Here are the best known:

  • Use headings:  Google loves it when you use h1, h2, h3, … headers in your page's HTML for titles and to highlight important areas.
  • Descriptive page URLs:  Try changing your page URLs to things like page.com/art/google. Search engines don't like URLs such as page.com/?id=20&titulo=google and may not follow them.
  • Meta tags:  Use meta tags responsibly. Although many believe they should no longer be used, I think some search engines are still quite sensitive to them (depending on which).
  • Bold:  Anecdotally, the HTML b tag (bold, deprecated by the W3C) is said to carry more weight than the strong tag.
  • Page title: Google gives great importance to the page title. Choose good words to include in it. I still see pages whose title is Untitled document or page.
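Putting the points above together, a page skeleton might look like this (the titles and text are placeholder examples of mine, not content from any real site):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Google gives great weight to the title: choose its words well -->
  <title>Google SEO basics — Example Blog</title>
  <!-- Use meta tags responsibly -->
  <meta name="description" content="A short, honest summary of this page.">
</head>
<body>
  <!-- Headings h1, h2, h3… mark out the important areas of the page -->
  <h1>Google SEO basics</h1>
  <h2>Why headings matter</h2>
  <p>Important phrases can be highlighted in <b>bold</b>.</p>
</body>
</html>
```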

Other positioning tricks

SEO in Google 
Here is a small manual on Google SEO. With it, you can appear at the top of the results. Remember that you can only reach the first position if you try hard enough.
Search engine optimisation has become one of the most desirable tools for webmasters in recent years. In particular, appearing in the top positions in Google is a must if you want to generate traffic to your website, because Google handles almost 60% of Internet searches.
The tutorial we offer will help you achieve good SEO, but do not expect results within days or weeks, perhaps not even months. It is constant work: generating content, optimising web pages and monitoring the results pays off in the long term.

1. Genuine content.

This is the main point, not only for appearing in the top positions in Google but for getting people to visit your website at all.

2. Technology

Stay informed of the latest developments in servers and programming languages. They will make managing your content much more pleasant, and Google will "like" your site more.

3. Simplicity

Google's robot does not like pages with excessive decoration; it simply looks for legible text and clear content.

4. Get listed in Google.

The first step towards a good position in Google is appearing in its index at all.

5.  Getting links.

Obtaining links is an essential pillar of a high PageRank. Try to appear in the main directories (Yahoo! and DMOZ) and get other websites to link to you.

6. Avoid penalties

Google knows that many people will try to cheat, and it is taking action against websites that use unethical practices to improve their rankings. Find out what these practices are and how to avoid them.

7.  Getting Help

Positioning requires staying constantly informed. Join our forum and follow the Google Dance.

8. SEO terms.

Learn the terms and words that will let you easily understand all documentation relating to SEO.

Avoid spam techniques

Something Google gives a lot of importance to is popularity. Getting many links from other pages will move you up in Google's "ranking" of positions. On the one hand, this is a good evaluation system; on the other, it is a very tempting feature for spammers, who navigate from page to page leaving links to their own sites. My advice is to avoid this kind of spam as much as possible. Leave comments on blogs intelligently (when you have something to contribute, not just to drop your link), add your blog to forum or newsgroup signatures (in a decent way, without enlarging the text or using gaudy or oversized images), use trackbacks only when necessary and relevant, and so on.

I personally dislike link exchanges. However, Google does give them weight when the topics are similar and the links are natural. My advice is to exchange links only when you genuinely like and are interested in the site you are linking to, not whenever someone proposes an exchange. That way you really favour quality content and do not promote spam.

Another very important factor, according to some surveys, is the author's reputation. Obviously, if you are a spammer who goes around leaving links on many pages, it will not take long to lose credibility. All this without mentioning the rel="nofollow" attribute that many pages apply (in comments, for example). This attribute simply tells Google not to follow the links, which is ideal for deterring comment spam, since spammers generally seek to boost their reputation by collecting links.
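For reference, this is what the attribute looks like on a comment link (the URL is a placeholder):

```html
<!-- Google will not follow this link or count it towards the target's reputation -->
<a href="https://example.com/" rel="nofollow">commenter's site</a>
```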

Set up RSS feeds

The feed is a very novel and interesting invention: a file with information about the latest articles published on the site. Through a feed reader, anyone can subscribe to the RSS feeds of the websites they follow and see which ones were updated, all at once, without having to visit each site independently. As a blogger, you should provide a feed (RSS, Atom or similar) so that readers can receive your site's updates comfortably; this will almost certainly increase your number of visits. If you use a blogging platform, make sure the option to generate these files is enabled. Related services that may interest you are FeedBurner and FeedBlitz. From feed readers such as Bloglines or Google Reader, you can also see how many subscribers your blog has.
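To make the shape of a feed concrete, here is a sketch that builds a minimal RSS 2.0 document with Python's standard library. Most blogging platforms generate this for you; the function name and the item data below are illustrative, not part of any real API:

```python
import xml.etree.ElementTree as ET

def build_rss(blog_title: str, blog_link: str, items) -> str:
    """Build a minimal RSS 2.0 feed; items are (title, link, pubDate) tuples."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = blog_title
    ET.SubElement(channel, "link").text = blog_link
    for title, link, pub_date in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
        ET.SubElement(item, "pubDate").text = pub_date
    return ET.tostring(rss, encoding="unicode")

# Hypothetical blog with a single entry.
feed = build_rss(
    "Example Blog",
    "https://example.com/",
    [("Latest post", "https://example.com/latest", "Mon, 06 Feb 2012 10:00:00 GMT")],
)
```

A feed reader fetching this file sees the channel's title and link plus one item per article, which is exactly how it detects updates without visiting the site.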


Site maps

If you are starting out, it can be a long wait for Google's robot to come and index your site, since the Googlebot first has to "find" a link to your blog on another page (desperation!) and only then crawl it. However, there is a relatively simple way to tell Google about all the pages of your site and speed the process up considerably: generate a site map (an XML file, though feed formats like the RSS cited above are also accepted), so that Google can fetch that file, see which links exist and, if they have changed, re-index them. It consists of a list of all the articles and sections of the page, with the date of last update (plus optional parameters).
In Google Webmaster Tools you can perform various sitemap-related actions: submit your sitemap to Google, submit your site, and even join a discussion forum, among other things. Generating the sitemap is very simple, but if you prefer not to do it by hand, you can get a free program like VIGOS Gsitemap that generates the file automatically. Other search engines, such as Yahoo!, also have their own sitemap systems.
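A hand-written sitemap following the Sitemaps protocol is just an XML list of URLs, each with an optional last-modification date — the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2012-02-06</lastmod>
  </url>
  <url>
    <loc>https://example.com/art/google</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```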

Quality content

As mentioned above, a blog's content is very important. Quality content will not only attract people interested in the articles written but will also increase visits, and search engines will take more interest in your site. It is always good to write about a subject you have mastered, though that is not essential. One thing that is essential is correct writing. Readers of a website pay a lot of attention to (and are quite attracted by) the author's spelling; spelling or grammar mistakes are unforgivable. To avoid them, there is a great add-on for the Firefox browser: spelling dictionaries, which underline misspelt words just as Microsoft Word does when you edit a document.

Accessibility and optimisation are also a very important point. Many of the sites I visit (attention, users of MSN Spaces or Blogger) contain an awful lot of content on their home page (probably due to the authors' inexperience). A home page (like any individual article) should not exceed a certain total weight across all its files (text, images, multimedia, etc.). For example, some MSN Spaces display the last nine entries at once. If each entry has 4 or 5 images of about 50 KB each, this may not sound like a problem, but (50 × 4) × 9 = 1,800 KB: nearly 2 MB for one page! Imagine the poor user who wants to view it on a 56k modem (downloading at about 4.5 KB/s): they would need to wait nearly seven minutes for the page to load before they could start reading!

To measure this there is, once again, a Firefox extension, called Life of request info. Once it is installed, when you load a page the following information appears at the bottom of the browser:
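The back-of-the-envelope calculation above is easy to generalise. This small helper (the function name is mine, not from any library) estimates load time from page weight and connection speed:

```python
def load_time_seconds(entries: int, images_per_entry: int,
                      kb_per_image: float, kb_per_second: float) -> float:
    """Estimated seconds to download all images on a page (text ignored for simplicity)."""
    total_kb = entries * images_per_entry * kb_per_image
    return total_kb / kb_per_second

# Nine entries with four 50 KB images each, on a 56k modem (~4.5 KB/s):
seconds = load_time_seconds(9, 4, 50, 4.5)  # 1,800 KB total → 400 s, nearly seven minutes
```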


  • Latency:  The wait from the moment the user clicks until information starts arriving.
  • Loading time:  The wait from the first information received until the page has loaded completely.
  • Page size:  Text, images, multimedia, etc.
  • Number of requests:  The files linked from the page.

The last three are the ones that interest us most. Acceptable values could be: latency between 0-3 s (more may mean a misconfigured web server or a very slow PC connection); loading time between 0-45 s (the less the better; over 45 s is a rather slow site); size between 15-200 KB (if yours is greater, ask yourself how to reduce your page: fewer images, fewer entries/articles, etc.); requests between 15-200 (over 100 requests is considered a fairly loaded page). Realise that the bigger the page, the longer it takes to load. Obviously, these figures are only indicative. If you are interested in web optimisation, here is a talk I gave at Tenerife LAN Party 2012 on techniques to improve the speed, positioning and performance of your website.


Finally, and by no means least: the design of the page. Many authors consider it secondary, but for me it is as essential as any other point. Personally, I think the graphic design of a blog or web page should be distinctive (identical templates used by everyone give very little impression of freshness). It should be eye-catching but not heavy: use colours that do not tire the eyes yet are not monotonous, and colourful, appealing images. In the article Image formats: Optimisation Guide you have a complete and detailed list of image formats with their characteristics and features, ideal for getting the smallest possible size (and thus faster downloads) with the highest possible quality. You can also check this list of 12 applications to optimise images. It is also important to respect the standards set by the W3C: although Google is not very strict about this, the simpler the page is to process, the better you will be "marked". Do not forget accessibility either: do not abuse JavaScript (provide alternative methods for browsers without it), test different resolutions and browsers, and make sure the whole page is navigable. If, in addition, you provide a version for mobile devices, PDAs and so on: unbeatable.