Robots.txt


It is great when search engines frequently visit your site and index your content, but there are often cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk being penalized for duplicate content.

Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages (although in this case the only sure way to keep sensitive data from being indexed is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and JavaScript from indexing, you need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your preferences is to use a robots.txt file.

What Is Robots.txt?
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a kind of password protection); putting up a robots.txt file is more like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.
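If you want to check programmatically how a crawler that honors robots.txt would treat a given URL, Python's standard library includes a parser for exactly this. The following is a minimal sketch, assuming Python 3; the domain and paths are placeholders:

# Check how a robots.txt-honoring crawler would treat a URL.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://mydomain.com/robots.txt")   # robots.txt must sit in the main directory
rp.read()                                      # fetch and parse the file

# True if the given user agent may fetch the URL, False otherwise
print(rp.can_fetch("*", "http://mydomain.com/temp/page.html"))
print(rp.can_fetch("Googlebot", "http://mydomain.com/index.html"))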

The concept and structure of robots.txt was developed more than a decade ago, and if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.

Structure of a Robots.txt File
The structure of a robots.txt file is pretty simple (and barely flexible) – it is an open-ended list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

“User-agent:” names the search engine crawler a record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/
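For instance, if you also wanted to keep crawlers away from images, stylesheets and JavaScript, as mentioned earlier, a file along these lines would do it (the directory names here are only examples and depend on how your site is organized):

User-agent: *
Disallow: /images/
Disallow: /css/
Disallow: /js/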

The Traps of a Robots.txt File
When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradictory directives. Typos are misspelled user agents or directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find, but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above example is from a robots.txt file that allows all agents to access everything on the site except the /temp directory. Up to here it is fine, but later on there is another record that specifies more restrictive terms for Googlebot. When Googlebot starts reading robots.txt, it will see that all user agents (including Googlebot itself) are allowed to access all folders except /temp/. This is enough for Googlebot to know, so it will not read the file to the end and will index everything except /temp/ – including /images/ and /cgi-bin/, which you think you have told it not to touch. You see, the structure of a robots.txt file is simple, but serious mistakes can still be made easily.

Tools to Generate and Validate a Robots.txt File
Given the simple syntax of a robots.txt file, you can always read it yourself to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report common mistakes like missing slashes or colons, which, if not detected, compromise your efforts. For instance, if you have typed:

User agent: *

Disallow: /temp/

this is wrong because the hyphen between “User” and “agent” is missing and the syntax is incorrect.
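If you want to catch such slips yourself before uploading the file, a few lines of Python are enough for a rough check. The sketch below only flags unknown field names and missing colons; it is by no means a full validator:

# Rough robots.txt syntax check: flags unknown fields and missing colons.
KNOWN_FIELDS = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay"}

with open("robots.txt") as f:
    for number, line in enumerate(f, start=1):
        line = line.strip()
        if not line or line.startswith("#"):    # skip blank lines and comments
            continue
        if ":" not in line:
            print(f"Line {number}: missing colon -> {line}")
            continue
        field = line.split(":", 1)[0].strip().lower()
        if field not in KNOWN_FIELDS:
            print(f"Line {number}: unknown field '{field}' -> {line}")

Run against the example above, it would report an unknown field on the “User agent: *” line.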

When you have a complex robots.txt file – i.e. you give different instructions to different user agents or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select which files and folders are to be excluded. But even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much of a help, unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.
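If you would rather not rely on an online generator at all, the same job can be done with a short script of your own. The sketch below simply turns a mapping of user agents to disallowed paths into a robots.txt file; the agents and directories are placeholders you would replace with your own:

# Generate a simple robots.txt from a mapping of user agents to disallowed paths.
rules = {
    "*": ["/temp/"],
    "Googlebot": ["/temp/", "/images/", "/cgi-bin/"],
}

lines = []
for agent, paths in rules.items():
    lines.append(f"User-agent: {agent}")
    lines.extend(f"Disallow: {path}" for path in paths)
    lines.append("")                  # blank line between records

with open("robots.txt", "w") as f:
    f.write("\n".join(lines))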


How to Build Backlinks


It is beyond question that quality backlinks are crucial to SEO success; the real question is how to get them. While on-page content optimization seems easier because everything is up to you to do and decide, with backlinks it looks like you have to rely on others to work for your success. This is only partially true: while backlinks are links that start on another site and point to yours, you can still discuss details such as the anchor text with the webmaster of the other site. Yes, it is not the same as administering your own sites – i.e. you do not have total control over backlinks – but there are still many aspects that can be negotiated.

Getting Backlinks the Natural Way
The idea behind including backlinks as part of the page rank algorithm is that if a page is good, people will start linking to it. And the more backlinks a page has, the better. But in practice it is not exactly like this, or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant you can get a lot of quality backlinks, including from sites on topics similar to yours (and these are the most valuable kind of backlinks, especially if the anchor text contains your keywords), but what you get without effort could be less than what you need to successfully promote your site. So you will have to resort to other ways of acquiring quality backlinks, as described next.

Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome, and the time you spend building them is not wasted. Among the acceptable ways of building quality backlinks are getting listed in directories and posting in forums, blogs and article directories. The unacceptable ways include interlinking (linking from one site to another site owned by the same owner, or to a site that exists mainly to be a link farm), linking to spam sites or sites that host any kind of illegal content, purchasing links in bulk, linking to link farms, etc.

The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, a comment, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.

You might wonder why sites like those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple – they need content for their site. When you post an article or submit a link to your site, you do not get paid; you provide them for free with something they need – content – and in return they provide you for free with something you need – quality backlinks. It is a fair trade, as long as the sites where you post your content or links are respected ones and you don't post fake links or content.

Getting Listed in Directories
If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.

Forums and Article Directories
Generally search engines index forums, so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, a backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit the forum or blog policy. Also, sometimes administrators do not allow links in posts unless they are relevant ones. In some rare cases (which are more an exception than a rule) the owner of a forum or a blog may have banned search engines from indexing it, and in that case posting backlinks there is pointless.

While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming, because generally articles are longer than posts and need careful thinking while writing them. But it is also worth the effort, and it is not so difficult to do.

Content Exchange and Affiliate Programs
Content exchange and affiliate programs are similar to the previous methods of getting quality backlinks. For instance, you can offer interested sites your RSS feed for free. When the other site publishes your RSS feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headline and the abstract they read on the other site.

Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks, but they tend to be an expensive way because generally the affiliate commission is in the range of 10 to 30%. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?

News Announcements and Press Releases
Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites (for instance, here is a list of some of them) that publish news announcements and press releases for free or for a fee. A professionally written press release about an important event can bring you many, many visitors, and the backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot release press releases when there is nothing newsworthy. That is why we say that news announcements and press releases are not a routine way to build backlinks.

Backlink Building Practices to Avoid
One of the practices to be avoided is link exchange. There are many programs that offer to barter links. The principle is simple – you put a link to a site, they put a backlink to your site. There are a couple of important things to consider with link exchange programs. First, take care of the ratio between outbound and inbound links: if your outbound links are many times more numerous than your inbound links, this is bad. Second (and more important) is the risk that your link exchange partners are link farms. If this is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.

Linking to suspicious places is something else that you must avoid. While it is true that search engines do not punish you for having backlinks from such places (because it is assumed that you have no control over who links to you), if you enter a link exchange program with so-called bad neighbors and you link to them, this can be disastrous for your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of getting tons of links in a short period of time, because this looks artificial and suspicious.


Become a True Facebooker and Blogger


While a number of companies and institutions ban the use of Facebook and other social networking sites in the workplace, Australia is going the other way. The government of the Land of the Kangaroo actually encourages its employees to be active online. Why?

The Australian government encourages its employees to be active on social networks such as Facebook and Twitter, as well as on blogs. The reasoning is to remove the wall separating the government from the general public.

The Australian government believes the online world can serve as a bridge between different parts of the country. Furthermore, rather than viewing social networks merely as entertainment, the government sees these sites as an opportunity to discuss ideas and gather feedback.

Besides Facebook, every public agency is expected by the government to become more familiar with other popular sites such as the encyclopedia Wikipedia and the video site YouTube. As for blogs, the government sees this medium as a channel through which the public can comment on the policies it issues.

"Access to networks such as email and instant messaging opens up powerful opportunities for collaboration, especially when the parties involved are physically separated. Likewise, Twitter, Facebook and blogs provide access to important information and open up communication," reads part of a draft prepared by the Australian government.


Building Backlinks


Are backlinks really that important for online success?
  • First: Links are very important because they let our site be picked up by robots and eventually indexed by search engines, whose next task is to determine rankings according to the terms (categories) of the site or blog.
  • Second: Links strengthen the relevance of our blog pages. By building more links with other relevant sites, we make it easier for Google to find our pages, which are then indexed and ranked. So whenever Google inspects a link exchange site, for example, it will find our link there. Eventually our site is considered popular and placed at a high ranking for certain terms (categories).

To understand this further, we also need to know two important concepts: Link Popularity and Link Reputation.
Link Popularity emphasizes how many quality links point to a blog or website.
Link Reputation emphasizes which keywords are used in the anchor text, and whether they are relevant to one another. If I may coin a term, it is about related keyword variations. These related keyword variations matter because visitors do not always use the same words as keywords when searching for information through search engines. For example, A uses the phrase "Situs Tukar Link" as keywords to find information about link exchange, while B uses the term "Web Link" to find the same information. So the link exchange topic has several keyword variations, such as "situs tukar link" and "web link". Once optimized, whether someone searches for "Situs Tukar Link" or "Web Link", our site will still sit at the top (at least on the first page) of Google. Great, isn't it?

These two concepts are therefore what we have to work for: first, getting as many quality links as possible, and second, building many relevant keyword variations for our site.

So the next question is: how do we get backlinks? Besides doing link exchanges, there is no harm in writing an article or leaving comments on forums and on articles on other sites, especially DoFollow ones, so that the links count as backlinks.

Does this mean we have to check the status of every site we plan to comment on, one by one? Perhaps. But if you want another strategy, you can use BacklinkWatch. That site counts the backlinks of a URL and displays the sites on which the URL appears as a backlink.
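If you would rather check a single page yourself, for example to see whether your link is really there and whether it carries a nofollow attribute, a short script can do it. This is only a rough Python sketch; the page and site URLs are placeholders and error handling is left out:

# Check whether a page links back to our site and whether the link is nofollow.
from html.parser import HTMLParser
from urllib.request import urlopen

PAGE = "http://example-forum.com/thread/123"    # page where we left a comment
OUR_SITE = "http://mydomain.com"                # site the backlink should point to

class LinkChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if OUR_SITE in href:
            rel = attrs.get("rel") or ""
            print(f"Found backlink: {href} (rel='{rel}')")

html = urlopen(PAGE).read().decode("utf-8", errors="ignore")
LinkChecker().feed(html)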


Let's Make Our Own Blogspot Template

Having a template that looks different... why not? It is certainly fun to have a blog with a template you made yourself. You can get creative in deciding the layout, blog structure, text, colors and much more. Inspired by that, I tried to find references for making this Blogspot template. It turns out there is software for building templates easily; the software is paid, not free, but you can still try the trial version.