Get Traffic from Facebook


Facebook is not the first social network, but it is by far the most popular one. Many social networks came before it, and some were popular in their day, but none reached Facebook's scale. Besides keeping in touch with your friends, Facebook can be (and is) used for business: you can use it to promote your products and services, to acquire new clients, or to get traffic to your site.

Like Twitter, Facebook is just one of many ways to get traffic to your site. Many marketers believe it is only a matter of time before the traffic their sites get from Facebook, Twitter, and the other major social networks surpasses the traffic they get from Google.

While that time might come, don't take this as a promise that, if you do everything right, Facebook, Twitter, or any similar site will work traffic miracles for you. For some people Facebook works like a charm; for others it doesn't work at all. The same applies to Twitter. You can't know in advance whether Facebook and/or Twitter will flood your server with traffic, so try both and see which one (if any) works for you.

Unlike Twitter, which is quite simplistic, Facebook offers many more possibilities. You might need more time to explore all of them and take advantage of them, but the effort can pay off handsomely in traffic. Here are some tips that can help you turn Facebook into a traffic monster:

1. Your profile is your major weapon
As with Twitter and any other social network, if your profile isn't interesting, you will hardly become popular. Give enough background information about yourself, and don't forget to make your profile public; that way even people who don't know you can become interested when they come across your profile, and may become supporters of yours.

2. Include information about your site on your Wall and in the photo gallery
Facebook gives you the opportunity to write a lot about yourself and your endeavors, as well as to include pictures, so use all these opportunities to build interest in you and your products. It is even better to post videos and fill in the other tabs, so if you have something meaningful to put there, do it.

3. Build your network
As with other social networking sites, your network is your major asset. That is why you need to invite your friends, acquaintances, and partners and ask them to join as supporters. You should also search for people with interests similar to yours. However, don't be pushy and don't spam: that is not the way to convince people to join your network.

4. Post regularly
No matter how interesting the content of your Facebook profile is, if you don't publish new content regularly, the traffic to your profile (and, consequently, the Facebook traffic to your site) will slow down. Posting daily is great, but even if you can't manage that, post as frequently as you can. If nothing else, update your status regularly; it is better than nothing.

5. Be active
A great profile, an impressive network, and regular posting are only part of the recipe for success on Facebook. You also need to be active – visit the profiles of your supporters, take part in their groups and other initiatives, visit their sites. True, all this takes a lot of time, and you might soon discover that Facebooking is a full-time occupation, but if you notice an increase in traffic to your site, it is all worth it.

6. Arrange your page
Unlike other social networks, Facebook gives you considerable flexibility: you can move many of the boxes around. If you put the RSS feed with the links to your blog in a visible spot, this alone can generate a lot of traffic for you.

7. Check what Facebook apps are available
Facebook apps are numerous, and new ones are released all the time. While many of them are not exactly what you need, there are apps that can work very well for you. For instance, the MarketPlace widget/plugin and the Blog Friends widget are very useful, and you should take advantage of them. You can also use widgets for cross-posting (e.g. posting to Twitter directly from Facebook), because this saves you time.

8. Use Facebook Social Ads
If you can't get traffic the natural way, consider using Facebook Social Ads. These are PPC ads, and starting a campaign is similar to starting an AdWords campaign.

9. Start a group
There are many groups on Facebook, but it is quite probable that there is still a free niche for you. Start a group about something related to your business and invite people to join it. The advantage of this approach is that you get targeted users: people who are interested in you, your products, your ideas, and so on.

10. Write your own Facebook extensions
While this step is certainly not for everybody, if you can write Facebook extensions, this is one more way to make your Facebook profile popular and get some traffic to your site.

11. Use separate profiles
Unfortunately, social networks expose a lot of personal information, and you are not paranoid if you don't want so much publicity. Many people are rightfully worried about their privacy on social networking sites, which is why it is not uncommon to keep one personal profile for friends and a separate business profile for promotion. You can use a single profile for both purposes, but if you have privacy concerns, consider splitting it in two: better safe than sorry.
Facebook is changing all the time, and no matter how closely you follow these changes, there will always be new possibilities for you to explore. That is why it is impossible to compile a complete list of the tactics you can use to drive traffic from Facebook to your site. Still, if you apply just the basics for Facebook success listed here, chances are you will see a considerable increase in traffic.


Robots.txt


It is great when search engines frequently visit your site and index your content, but there are often cases when you do not want parts of your online content indexed. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk incurring a duplicate content penalty.

Also, if you have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages (although in this case the only sure way to keep sensitive data out of the index is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets, and JavaScript from indexing, you need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your website to avoid is the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your wishes is to use a robots.txt file.

What Is Robots.txt?
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a note “Please do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look only in the main directory (i.e. http://mydomain.com/robots.txt), and if they don't find it there, they simply assume that the site does not have a robots.txt file and index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised if search engines index your whole site.
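As a small illustration of this rule, here is a sketch (Python, standard library only) of how a crawler derives the robots.txt location from any page URL; mydomain.com is just the example domain used above:

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    # Crawlers look for robots.txt only at the root of the host,
    # no matter how deep the page itself sits in the site tree.
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://mydomain.com/blog/2009/some-post.html"))
# http://mydomain.com/robots.txt
```

Only the scheme and host survive; the path is always replaced with /robots.txt.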

The concept and structure of robots.txt were developed more than a decade ago. If you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will look at the structure of a robots.txt file.

Structure of a Robots.txt File
The structure of a robots.txt file is pretty simple (and barely flexible): it is a list of user agents and of disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

“User-agent:” names the search engine crawler a record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/

The Traps of a Robots.txt File
When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can arise if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradictory directives. Typos include misspelled user agents and directories, missing colons after User-agent and Disallow, and so on. Typos can be tricky to find, but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above example is from a robots.txt file that disallows the /temp/ directory for all agents and then gives Googlebot a more restrictive record of its own. The trap is that records do not combine: a robot obeys the single record that matches its user agent most specifically, and falls back to the “*” record only if no specific record exists. So Googlebot will follow its own record (ignoring the one for “*”), while all other crawlers will follow the “*” record. In this example that works out, because /temp/ is repeated in Googlebot's record; but if you listed only /images/ and /cgi-bin/ there, Googlebot would crawl /temp/ even though the “*” record disallows it. The structure of a robots.txt file is simple, yet serious mistakes are easy to make.
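You can check how a given crawler is treated by a file like the one above with Python's standard urllib.robotparser module (the "SomeOtherBot" name below is just an illustrative placeholder; real crawlers may differ in how loosely they match records):

```python
import urllib.robotparser

# The example robots.txt from the text: a general record plus a
# more restrictive record specifically for Googlebot.
ROBOTS_TXT = """\
User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot is matched by its own, more specific record:
print(rp.can_fetch("Googlebot", "/images/"))     # False
print(rp.can_fetch("Googlebot", "/temp/"))       # False
# Any other crawler falls back to the "*" record:
print(rp.can_fetch("SomeOtherBot", "/images/"))  # True
print(rp.can_fetch("SomeOtherBot", "/temp/"))    # False
```

Note that Googlebot's answers come entirely from its own record: the “*” record plays no part for it.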

Tools to Generate and Validate a Robots.txt File
Given the simple syntax of a robots.txt file, you can always read it yourself to see whether everything is OK, but it is much easier to use a validator like this one: http://tool.motoricerca.info/robots-checker.phtml. Such tools report common mistakes, like missing hyphens or colons, which, if undetected, compromise your efforts. For instance, if you have typed:

User agent: *

Disallow: /temp/

this is wrong, because the hyphen between “User” and “agent” is missing and the syntax is incorrect.

When you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry: there are tools that will generate the file for you. What is more, there are visual tools that let you point and click to select which files and folders to exclude. And even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box in which to list the files you don't want indexed. Honestly, it is not much help, unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.
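If you want a quick home-grown sanity check for exactly this kind of typo, a few lines of Python are enough. This sketch is illustrative, not a full validator: the directive list is deliberately minimal, and the messages are made up for this example.

```python
# Illustrative robots.txt line checker: flags directives outside a small
# known set (so "User agent" without the hyphen is caught) and lines that
# are missing the colon. A real validator checks much more than this.
KNOWN_DIRECTIVES = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay"}

def check_robots_lines(text):
    problems = []
    for lineno, line in enumerate(text.splitlines(), 1):
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue  # blank and comment-only lines are fine
        if ":" not in line:
            problems.append((lineno, "missing colon"))
            continue
        field = line.split(":", 1)[0].strip().lower()
        if field not in KNOWN_DIRECTIVES:
            problems.append((lineno, "unknown directive %r" % field))
    return problems

print(check_robots_lines("User agent: *\nDisallow: /temp/"))
# flags line 1: "user agent" is not a valid directive
```

Running it on the broken example above reports the missing hyphen on line 1, while a correct file produces an empty list.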


How to Build Backlinks


It is beyond question that quality backlinks are crucial to SEO success. The question, rather, is how to get them. With on-page content optimization it seems easier, because everything is up to you to do and decide; with backlinks it looks like you have to rely on others to work for your success. This is only partially true: while backlinks are links that start on another site and point to yours, you can discuss details such as the anchor text with the webmaster of the other site. It is not the same as administering your own site – you do not have total control over backlinks – but many aspects can still be negotiated.

Getting Backlinks the Natural Way
The idea behind including backlinks in the page rank algorithm is that if a page is good, people will start linking to it, and the more backlinks a page has, the better. In practice, though, it is not exactly like this – or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant, you can get many quality backlinks, including links from sites on topics similar to yours (these are the most valuable kind, especially when the anchor text contains your keywords), but what you get without effort may be less than what you need to promote your site successfully. So you will have to resort to other ways of acquiring quality backlinks, as described next.

Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome, and the time you spend building them is not wasted. Acceptable ways of building quality backlinks include getting listed in directories and posting in forums, blogs, and article directories. Unacceptable ways include interlinking (linking between sites owned by the same owner, or sites that exist mainly to serve as link farms), linking to spam sites or sites hosting any kind of illegal content, purchasing links in bulk, linking to link farms, and so on.

The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, message, posting, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.

You might wonder why sites like those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple: they need content for their sites. When you post an article or submit a link, you are not paid, but you provide them with something they need – content – and in return they provide you with something you need – quality backlinks. It is a fair trade, as long as the sites where you post your content or links are respected and you don't post fake links or content.

Getting Listed in Directories
If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.

Forums and Article Directories
Search engines generally index forums, so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, the backlink is valuable. However, in some cases the administrator can edit your post, or even delete it if it does not fit the forum or blog policy; sometimes administrators do not allow links in posts at all, unless they are relevant. In some rare cases (more an exception than a rule) the owner of a forum or blog has banned search engines from indexing it, and in that case posting backlinks there is pointless.

While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming, because articles are generally longer than posts and require careful thought while writing. But it is also worth the effort, and it is not that difficult to do.

Content Exchange and Affiliate Programs
Content exchange and affiliate programs are similar to the previous methods of getting quality backlinks. For instance, you can offer interested sites your RSS feed for free. When the other site publishes your feed, you get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headlines and abstracts they read on the other site.

Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks, but they tend to be an expensive way, because the affiliate commission is generally in the range of 10 to 30%. Still, if you run an affiliate program anyway, why not use it to get some more quality backlinks?

News Announcements and Press Releases
Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites (for instance, here is a list of some of them) that publish news announcements and press releases for free or for a fee. A professionally written press release about an important event can bring you many visitors, and a backlink from a respected site is a good boost to your SEO efforts. The tricky part is that you cannot issue a press release when there is nothing newsworthy; that is why we say that news announcements and press releases are not a routine way to build backlinks.

Backlink Building Practices to Avoid
One practice to avoid is link exchange. There are many programs that offer to barter links. The principle is simple: you put a link to a site, and they put a backlink to yours. There are a couple of important things to consider with link exchange programs. First, mind the ratio between outbound and inbound links; if your outbound links far outnumber your inbound ones, that is bad. Second (and more important) is the risk that your link exchange partners are link farms. If that is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.

Linking to suspicious places is something else you must avoid. While it is true that search engines do not punish you for backlinks from such places – it is assumed that you have no control over what bad guys link to – if you enter a link exchange program with so-called bad neighbors and you link to them, this can be disastrous to your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of acquiring tons of links in a short period of time, because this looks artificial and suspicious.


Be a True Facebooker and Blogger


While a number of companies and institutions ban the use of Facebook and other social networking sites in the workplace, Australia is different: the government in the Land of the Kangaroo actually encourages its employees to be active online. Why?

The Australian government encourages its employees to be active on social networking sites such as Facebook and Twitter, as well as on blogs. The reason: to break down the wall between the government and the general public.

The Australian government believes the online world can serve as a bridge between different parts of the country. More than that, rather than regarding social networks as mere entertainment, the Australian government sees an opportunity in these sites as places to discuss ideas and gather feedback.

Beyond Facebook, the government expects every public institution to become more familiar with other popular sites, such as the encyclopedia site Wikipedia and the video site YouTube. As for blogs, the government sees this medium as a channel through which the public can comment on the policies it issues.

"Access to networks such as email and instant messaging opens powerful opportunities for collaboration, especially when the collaborating parties are physically separated. Likewise, Twitter, Facebook, and blogs provide access to important information and open up communication," read several statements in a draft prepared by the Australian government.


Building Backlinks


Are backlinks really that important for success online?
  • First: links are essential because they allow our site to be picked up by robots and finally indexed by the search engines, whose job is then to rank it according to the terms (categories) of the site or blog.
  • Second: links strengthen the relevance of our blog's pages. By building more links with other relevant sites, we make it easier for Google to find our pages, which it then indexes and ranks. So every time Google inspects a link exchange site, for example, it will find our link there. Eventually our site is regarded as popular and is placed high in the rankings for particular terms (categories).

To go further, we also need to understand two important concepts: Link Popularity and Link Reputation.
Link Popularity concerns how many quality links point to a blog or website.
Link Reputation concerns which keywords are used in the anchor text, and whether they are relevant to one another. If I may coin a term for it: related keyword variations. Such variations matter because visitors do not always use the same words as keywords when searching for information through the search engines. For example, A uses the phrase Situs Tukar Link (link exchange site) as the keyword to look for link exchange information, while B uses the term Web Link to look for the same information. So the link exchange topic has several keyword variations, such as situs tukar link and web link. Once optimization is done well, whether the keyword used is Situs Tukar Link or Web Link, our site will still sit near the top (at least on the first page) of Google. Great, isn't it?

Those, then, are the two things worth fighting for: first, getting as many quality links as possible; and second, building many relevant keyword variations for our site.

So the next question is: how do we get backlinks? Besides link exchanges, it does no harm to write articles and leave comments on forums or on articles on other sites – especially DoFollow ones, so that the links count as backlinks.

Does that mean we have to check the status of every site we plan to comment on, one by one? Perhaps. But if you want another strategy, you can use BacklinkWatch. That site counts the backlinks of a given URL and displays the sites on which that URL appears as a backlink.