Tuesday, 30 August 2011

Tips for Running a Successful Social Media Campaign

Running a social media contest or competition can quickly win you new fans, build up your email list, and drive quality, qualified, interested traffic to your site. Without a good strategy in place, though, your hard-earned effort may well go to waste. Here are several strategies to turn your contest into a valuable, long-term marketing machine.

Begin With A Social Media Plan

Competitions should only represent one part of your social media marketing plan. If you don’t already have a strategy in place, remember: if you fail to plan, you plan to fail.

A plan basically involves:

  • A clear goal as to what you want to achieve
  • A calendar for your content
  • Targets for your outcomes
  • A swipe file of your content ideas and mind maps.
Have A Clear Goal

To make sure you’re running a profitable campaign, a clear goal is essential. A social media promotion will generally achieve one or more of the following:

  • Sales
  • Fans
  • Email Subscribers
  • PR/Brand Awareness
Define what you want to achieve before you begin to run your competition, then design your competition message and follow-up campaign around your desired outcomes.

Setting a clear goal also helps you keep track of your performance and make sure your contest is meeting your expected targets.

Use A 3rd Party App To Run Your Contest

Unless you have an awesome in-house programming team, and a tonne of cash to dedicate to developing your own contest application, use a third party application to host your contest.

This helps you comply with the social media platform’s terms of service and some platforms give you a range of tools to help you begin engaging with your fans.

Some popular contest applications that meet Facebook’s promotion rules include:

  • Binkd Promotion – free to use; sweepstakes, photo contests, and challenge contests for WordPress and Facebook
  • Wildfire App – Facebook sweepstakes, photo contests, and vote-to-win promotions
  • Bulbstorm – Facebook sweepstakes, photo contests, and vote-to-win promotions
  • Fanappz
  • Vitrue
  • BuddyMedia
  • Votigo
  • ContextOptional
  • NorthSocial
  • Momentus Media
  • Friend2Friend
  • Strutta

If your promotion includes live video, also consider:

  • Vpype
  • Livestream
  • JustinTV
  • UstreamTV
  • Linqto (going live January 2011 – this should be a very popular webinar/livestream app!)

Activate Your Marketing Machine

Unless you have an existing list of thousands of engaged fans, your contest is most likely going to need a bit of a push to get it going.

To get the most benefit out of your campaign, you should market across a number of platforms. Some strategies to consider are:

  • Your email list
  • Your existing social media platforms (Twitter, Facebook, LinkedIn etc)
  • Your blog and website
  • Joint venture partners’ lists
  • Radio or Television advertising
  • Print or paid web advertising
Have Your Site Ready To Go
A well run and popular contest will send traffic to your site. Chances are, if they have taken the time to visit your site, this traffic is qualified and interested in your product.

Be sure to have your website up to date before you launch your competition.

  • Make sure all links work
  • Have your opt-in links clearly visible and accessible
  • Ensure that your competition branding is clear and visible on your page
  • Make it easy to find and buy your product
  • Make sure your About Us and Contact Us pages are up to date and working.
Have A Powerful Follow-up

The fortune in any marketing campaign is in the follow-up. A well-run contest will bring you a wave of new, qualified, engaged and interested subscribers.

The most valuable people in your contest are not actually the winners; they are the people who didn’t win. Prepare your auto-responder campaign beforehand, encouraging your new fans to take further action in the competition.

During the social media competition

  • Give them tips on how they can increase their chances of winning
  • Teach them how to share your content
  • Engage with them on social media and really get to know them
  • Publish valuable, relevant, and shareable content.
A competition with a coveted prize will encourage a lot of interaction on the social media platform you run your competition on. Make sure you get in on the conversation, and actively chat and engage with your fan base.

After the social media competition

  • Keep the conversation open with post-competition specials, offers, and valuable content
  • Ensure you give them enough reason to stay on your list
  • Offer follow-up bonuses simply for having been a part of the contest.
Now, Go And Build Your Community

Your awesome contest should have driven a new tribe of interested, engaged fans to your social media platforms – which on its own is rather cool… but you can take it one step further.

Making use of the hype and excitement surrounding your contest, begin to build your community. A community is a place where your fans don’t just respond to your questions, articles, and updates; they actively seek out you, and one another, to engage.

Leveraging the warmth from your competition will help you position your company and brand as a trusted leader, a resource, and the go-to company in your niche.


7 Things you should consider while you are on Twitter

Anger: How often do you see people saying something out of anger on Twitter and you just know they will regret it? If you are feeling angry, go for a walk or find a punching bag and leave Twitter out of it.

Greed: This shows up when tweeps are greedy with others’ time (tweeting too much in too short a time) or following too many people in hopes that some will follow back and build their follower numbers. The old saying that "the more you want something, the more it eludes you" is true on Twitter too!

Laziness: People who auto-tweet and don’t engage (respond to questions and/or thank others’ kindness on Twitter) are usually easy to spot … and as a result they do not build communities of value.

Pride: Words on a screen can so easily be misinterpreted without the visual or audio cues we get from other forms of communication. Tweeps who fly off the handle at a perceived slight are victims of pride and likely revealing a lot more about themselves than they realize. If you always assume good intent until there is undeniable evidence to the contrary, you will do fine on Twitter.

Desire: This manifests itself in Twitter users who are too anxious to achieve their goals on the network and spend a lot of time pursuing them oblivious to what is happening around them on Twitter. This narcissism is rarely rewarded.

Envy: This is one of the ugliest sins and shows up when a Tweep decides to use Twitter to tear down another Tweep (usually someone with more standing on the network). If you’re turning green with envy over how well others are doing on Twitter learn from them – don’t attack them.

Voracity: The "voracious" Tweep is that person who retweets (RTs) and comments indiscriminately – usually dozens of times in a short time period. For 10-30 minutes at a time Twitter is, in their minds, all about them.

So be careful while using those 140 characters.

Monday, 29 August 2011

Blocking IPs through Analytics – A Graphical Guide

Choose the correct profile, then click on the “Filters” tab



Within the Filters tab, click the “+New Filter” button.  This is where you’re going to enter the details you’ve collected during your diligence above.
Now, there are quite a few ways to exclude or include data here. The simplest way is excluding by IP address. You can also see there are ways to set up profiles that only include data from certain referrers, or data from subdirectories on the site.
You’d use those in a separate profile that only tracked that data, but that’s a discussion for another time. For today’s exercise, we’re going to exclude traffic by IP address.




Make sure you name each filter something descriptive, such as “In Office IP” or “Bob’s home IP”; you want it to be easy to make changes in the future if someone changes their ISP or is no longer employed by the company. Fill in the information and save each filter. Every IP address needs its own filter, so this could take some time to set up, but you’ll see the difference when you’re looking at honest data once you’re done.
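
If you have a whole block of office IPs to exclude, you don’t necessarily need one filter per address: a Custom Filter set to Exclude on the Visitor IP Address field accepts a regular expression. As a rough sketch (the address range here is only a placeholder, swap in your own), a pattern covering 192.168.1.1 through 192.168.1.25 would look like:

^192\.168\.1\.([1-9]|1[0-9]|2[0-5])$

Double-check any pattern like this against your real addresses before saving it, since an over-broad expression will silently drop legitimate traffic from your reports.
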
The next thing I’d do is add an annotation to your data graph to remind anyone who looks at the data that employee and contractor IPs were removed on a specific date. This is really easy: a few clicks from the Analytics home page and you’re done.
Underneath the graph on your “My Site” Tab in the orange bar, click the down arrow.


Click “Create new annotation”, then enter the date and your note, and click Save.


You can now see the annotation in the graphic, and you can see how excluding the IP address affected all of your data.


Tracking your ad position with ValueTrack

ValueTrack Overview
ValueTrack is Google’s easy-to-use AdWords URL-tagging feature. With it, you can get detailed data about each click on your ad and feed this data into your own tracking solution. You can use that information to adjust your targeting criteria to optimize your return on investment.

How it works

When a user clicks your ad and visits your website, ValueTrack can record a number of pieces of information: the website where the user clicked your ad, the specific ad creative a user was shown, which keyword triggered your ad, and more. Implementation is easy: just add a tag to your ad's destination URL.

Tracking your ad position with ValueTrack

Starting now, your destination URL can include the {adposition} parameter. The {adposition} parameter works for search campaigns on google.com. Here's a quick example of how you might use this parameter:

Let's say you have a search campaign. If your website is www.example.com, you can use the new and existing ValueTrack parameters in your AdWords campaign to set the destination URL to: http://www.example.com/?adpos={adposition}.

When you receive a click, the landing page URL recorded in your logs will show the ad position replaced with values such as:
“1t2” = page 1, top, position 2
“1s3” = page 1, side, position 3


In cases where we cannot return this info (e.g. Display Network), we will show “none”.

Here are examples of what you may see in your logs:

http://www.example.com/?adpos=1t2
http://www.example.com/?adpos=1s3
http://www.example.com/?adpos=none
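
If you want to do something with this value on the landing page itself (rather than only in your server logs), a small piece of JavaScript can read it back out of the URL. This is only a sketch: the adpos parameter name comes from the example destination URL above, and the Google Analytics call assumes the traditional pageTracker object is already set up on the page.

<script type="text/javascript">
// Pull a named parameter (e.g. "adpos") out of the landing page's query string
function getQueryParam(name) {
    var match = new RegExp("[?&]" + name + "=([^&]*)").exec(window.location.search);
    return match ? decodeURIComponent(match[1]) : null;
}

var adpos = getQueryParam("adpos");
// Optionally record the position as a Google Analytics event
if (adpos && typeof(pageTracker) == "object") {
    pageTracker._trackEvent("AdWords", "Ad Position", adpos);
}
</script>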

Learn with Google – A New Website and a Good Learning Resource


Learn with Google is a new educational site designed to be a one-stop shop for businesses to learn about Google products and services like AdWords, Google Apps, Places, Analytics, and more. Whether you’re looking to grow your business or just be more productive, this collection of short videos, handouts, and resources will arm you with best practices and tips that you can use right away.

On the site, you’ll learn about:

Starting Your Business Online: Learn why having an online presence is critical for growing your business and how to create an online marketing plan. Discover how local marketing tactics can help you reach customers near you.

Marketing Your Business Online: Master the basics of online marketing and AdWords to get more for your money. Learn how to choose the right keywords, write compelling ads, and optimize your advertising spend.

Running Your Business Online: Learn how the Google Apps suite of online communications and productivity tools, like Gmail and Google Docs, can make your team more collaborative and your business more efficient.

AdWords Editor Version 9.5 for Windows and Mac now available

Campaign Experiments

You can now do the following to maintain your campaign experiments:
  • Apply and edit an experiment status (e.g. “control only”, “experiment only”, “control and experiment”) at the ad group, ad, or keyword level.
  • Apply and edit a Default Max. CPC, Display Network Max. CPC, or Max. CPM bid multiplier at the ad group level.
  • Apply and edit a Max. CPC bid multiplier at the keyword level.
  • Import and export experiment status and bid multipliers in both CSV and XML formats.
Location Extensions

Version 9.5 supports new and existing location extensions. You can create new manual location extensions for any address, modify existing locations, and download/upload location extensions in CSV and XML formats.

Background Download

If you’re working on several large accounts, you can now download them in the background while you’re working on another open account. This can be a great timesaver by allowing you to continue working rather than waiting for the download to finish.

We also listened to your feedback and made several usability improvements, including improved revert functionality and streamlined Add Multiple Items workflow.
The next time you log into your AdWords Editor account, you will be prompted to upgrade. You may also download Version 9.5 from the AdWords Editor website. After you install the new version of AdWords Editor, you will need to download your accounts again. To preserve your comments and unposted changes, select the Backup then Upgrade option in the automatic upgrade prompt and save the backup file to your computer. Then, re-download your account and import the backup file to AdWords Editor. A small portion of users may need to manually uninstall the previous version of AdWords Editor.

MixRank – A Good Tool for Competitive Analysis

Step 1: Pick a Competitor to Research
MixRank makes it super easy to get started. Just start typing in a domain name and you'll see a suggested list of names along with the number of ads available:


MixRank breaks their tool down into 2 core parts:
Ads (text and display)
Traffic Sources

Step 2: Working with Ad Data (Text and Display)
Let's start with text ad options. So with text ads you have 3 areas to look at:
Active Ads
Ad Reach
Best Performers 

The image above is for "Active Ads". In the Active Ads tab you'll get the following data points (all sortable):
  • Publishers – maximum number of AdSense publishers running that particular ad
  • Last Seen – last known date the ad was seen by MixRank
  • Frequency – number of publisher sites on which the ad appeared
  • Avg. Position – average position of the ad inside AdSense blocks
 
If the advertiser is running Banner Ads you can see those as well:

When you click on a banner ad you'll see this:



Step 3: Traffic Sources
Now that you have an idea of what type of text ads and banner ads are effective for your competition, it's time to move into what sites are likely the most profitable to advertise on.
MixRank gives you the following options with traffic sources:
Traffic Sources - domains being advertised on, last date when the ad was seen, average ad position and number of days seen over the last month
Reach - total number of publishers the advertiser is running ads on


 The traffic sources tab shows:
Uniques - estimated number of unique visitors based on search traffic estimates
Last Seen - last date MixRank saw the ad
Days Seen - number of days over the last month MixRank saw the ad
Average Position - average position in the AdSense Block

Facebook to Top Yahoo! in US Display Market, As Google Looms


Facebook will pass Yahoo! as the leader in the US online display advertising market this year, according to new estimates by eMarketer.
The social network's share of the $10.1 billion US display ad market will grow to 21.6% of overall revenues in 2011, up from a 13.6% share in 2010 and a 7.3% share in 2009.

Top 20 Brands on Facebook

1. Coca-Cola (31,762,653)
2. Disney (26,613,752)
3. Starbucks (23,574,606)
4. Oreo (21,864,091)
5. Red Bull (21,220,373)
6. Converse All Star (19,880,308)
7. Converse (18,977,840)
8. Skittles (18,386,827)
9. PlayStation (16,245,633)
10. iTunes (15,862,234)
11. Pringles (14,765,300)
12. Victoria’s Secret (14,384,903)
13. Windows Live Messenger (13,926,945)
14. Ferrero Rocher (11,676,898)
15. Monster Energy (11,492,620)
16. Nutella (10,696,260)
17. iPod (10,530,905)
18. Adidas Originals (10,433,947)
19. Xbox (10,388,218)
20. Dr Pepper (9,927,828)

14 Best Practices for Brands to Grow their Audiences in Social Media


1. Design an Effective Channel Strategy: Evaluate the main brand, sub-brands, and notable personalities that require a “follow worthy” or “likable” presence. If there are other accounts that exist beyond the initial strategy, assess their value as standalone channels and their current state. It may be best to simply truncate accounts or close them altogether.

2. Create a Life Support System: Develop an organized framework that supports each presence uniquely. Ensure that each account establishes a rhythm that meets the needs of its audience.

3. Mission and Purpose: Know the audience you’re trying to reach and design a communicable mission and purpose for each account.

4. Develop an Editorial Program: Create an editorial program that addresses the various needs of the social consumer including entertainment, sales, service, engagement, HR, etc. Evoke the new K.I.S.S. (Keep It Significant and Shareable). Create content that’s engaging, contextually relevant, and shareable. Think beyond the basics, such as polls, curation, promotional content, and questions.

5. Construct a Listening Framework: The best listeners make the best conversationalists. Build a listening framework that monitors the brands as well as the distinct conversations related to each account.

6. Establish Conversational Workflow: Each account requires an information path and workflow. They also require bridges between them to ensure that every representative is informed and that the right delegates within the business are on point to engage or respond accordingly.

7. Formulate a Decision Tree: Draft a clear flowchart that details the steps for a variety of “if this happens, then do this” situations. This is designed to help representatives follow a pre-defined path for the real-time nature of engagement.

8. Initiate a Training Program: Representatives will require ongoing training to stay sharp and focused. Every engagement either reinforces or takes away from the brand experience. As technology moves faster than our ability to master its lessons, training keeps employees on track.

9. Install a Governance and Reward System: Much like the marketing team protects the integrity of the brand and how it’s presented, a social team is necessary to manage the integrity of each Twitter account as well as the overall portfolio. At the same time, a reward system must be put in place to encourage exceptional work.

10. Draft a Social Media Brand Style Guide: Chances are a style guide already exists that communicates brand presentation, usage guidelines, and other forms of brand-related marketing aesthetics. This guide requires a significant update to account for social media. Its primary function is to define the brand persona, characteristics, voice, and essence. Additionally, the updated style guide will define the design of each presence and how representatives should accurately enliven it through narrative.

11. Compose Guidelines and Do’s and Don’ts: Develop a social media policy that conveys the do’s and don’ts in social media. If one already exists, update it. The law has changed and now protects employee rights to express opinion about employers within their personal accounts. Additionally, many employees complain that the existing guidelines are either too extreme or ambiguous to define successful engagement. Design the guideline to serve as guardrails and also a roadmap to success.

12. Serve Customers and Prospects: Social consumers now expect brands to solve problems and answer questions in social streams. Each channel requires a service function or a dedicated channel to satisfy needs and promote appreciation and loyalty.

13. Employ Language and Timing Techniques: Two points of note: timing is everything, and in brevity there’s clarity. Studies already show that the time and day and the language structure of Tweets and Facebook updates determine overall reach and engagement. Optimize language and timing to make every update count.

14. Design Engagement and Performance Metrics: Monitor the performance of each account to improve the engagement and editorial strategy for each account.

Following these best practices will prevent your brand from falling victim to the coming wave of customer unlikes and unfollows. But more importantly, focusing social channels and investing in the value of each will improve the customer experience and encourage greater engagement. By increasing meaningful interaction, brand reach is dramatically amplified through the social effect, encouraging customers to not only Like the brand, but genuinely love it!

Originally published in Mashable.

An On-page Optimization SEO Checklist



1. Try to use your main keyword in your domain name, but avoid overly long domain names if possible.

2. Use keyword-rich anchor text for your on-page navigation and internal linking; do not use phrases like “click here” or “read more”. Use dofollow internal links spread throughout your content, linking to other related pages on your site.

3. Have well-written, unique content on your homepage and all other pages, and avoid duplicated content. (Follow the on-page SEO checklist below to optimize your internal pages for the search engines.)

4. Add a sitemap.xml file to your root directory; you can use Google Webmaster Tools to create such a file (a minimal example is shown after this checklist). Also remember to update this file whenever you post new content to the site (only needed for static websites).

5. Ensure that your robots.txt file exists.

6. Add a dofollow link to your sitemap files in your site’s navigation.

7. Add dofollow links on your 404 page pointing to your category or tag pages; you can also add a search box to make it easy for visitors to search your site.

8. Validate your site through Google Webmaster Tools.

9. Avoid long loading times on your site.

10. Set up Google Analytics or another tracking service and optimize your website on a regular basis.

11. Update your website on a regular basis. This doesn’t mean you need to post every day or several times per week; once per month can be enough.

12. Minimize your website’s loading time.

13. Make sure that your website is XHTML/W3C validated.

14. If possible, add an RSS feed to your website.

15. Make use of Google Webmaster Tools.

16. Use keyword-rich URLs for your individual pages, but avoid URLs longer than 80 characters.
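
As mentioned in point 4, a sitemap.xml file is simply an XML list of your URLs. A minimal hand-written example (the URLs and dates below are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-08-29</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about-us/</loc>
    <lastmod>2011-08-01</lastmod>
  </url>
</urlset>

Save it in your root directory (http://www.example.com/sitemap.xml) and submit it through Google Webmaster Tools.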

Top 10 Social Media Blogs: The 2011 Winners!


The following are the winners of Social Media Examiner’s Top 10 Social Media Blogs for 2011.

  1. Brian Solis: The grand master of social media, Brian is one of the web’s leading social media evangelists and his blog is required reading for businesses.
  2. TopRank: This popular blog, the brainchild of Lee Odden, provides exceptional social media advice and should be one of your daily destinations.
  3. Convince & Convert: Jay Baer’s Convince & Convert provides outstanding content for businesses seeking to embrace social media. This is the second year Jay has made our list.
  4. Six Pixels of Separation: Mitch Joel offers consistent and thought-provoking content delivered with personality.
  5. Social Media Explorer: This blog, from Jason Falls, provides excellent perspective on the current state of social media and should be a regular stop for serious social media marketers. This is the second year Jason has made our list.
  6. Brand Builder: For businesses looking to dive deep into social media discussion, check out Olivier Blanchard’s rich insights. This is the second year Olivier has made our list.
  7. Spin Sucks: Gini Dietrich’s blog takes a look at social media from a PR perspective. Check her site out!
  8. Danny Brown: Danny Brown’s blog examines the human side of social media with rich content and insights.
  9. The Anti-Social Media: For something completely unique, check out Jay Dolan’s satirical blog on the state of social media.
  10. BrandSavant: This unique blog from Tom Webster combines a great intellect with common sense, giving it an edge.


How to Generate a robots.txt file using the Generate robots.txt tool?

Here you go:


1. On the Webmaster Tools home page, click the site you want.
2. Under Site configuration, click Crawler access.
3. Click the Generate robots.txt tab.
4. Choose your default robot access. We recommend that you allow all robots, and use the next step to exclude any specific bots you don't want accessing your site. This will help prevent problems with accidentally blocking crucial crawlers from your site.
5. Specify any additional rules. For example, to block Googlebot from all files and directories on your site:
   1. In the Action list, select Disallow.
   2. In the Robot list, click Googlebot.
   3. In the Files or directories box, type /.
   4. Click Add. The code for your robots.txt file will be automatically generated.
6. Save your robots.txt file by downloading the file or copying the contents to a text file and saving as robots.txt. Save the file to the highest-level directory of your site. The robots.txt file must reside in the root of the domain and must be named "robots.txt". A robots.txt file located in a subdirectory isn't valid, as bots only check for this file in the root of the domain. For instance, http://www.example.com/robots.txt is a valid location, but http://www.example.com/mysite/robots.txt is not.
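
To give a concrete picture, if you followed the steps above (allow all robots by default, then disallow Googlebot from everything by typing /), the generated file would look roughly like this; the tool's exact output may differ slightly:

User-agent: *
Allow: /

User-agent: Googlebot
Disallow: /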

All you want to know about Robots.txt


Remove your website from search engines

If you wish to exclude your website from a search engine's index, you can place a file at the root of your server called robots.txt. This is the standard protocol that most web crawlers observe for excluding a web server or directory from an index. Please note that Googlebot does not interpret a 401/403 response ("Unauthorized"/"Forbidden") to a robots.txt fetch as a request not to crawl any pages on the site.
To remove your site from search engines and prevent all robots from crawling it in the future, place the following robots.txt file in your server root:
User-agent: *
Disallow: /
To remove your site from Google's index only and prevent just Googlebot from crawling your site in the future, place the following robots.txt file in your server root:

User-agent: Googlebot
Disallow: /
Each port must have its own robots.txt file. In particular, if you serve content via both http and https, you'll need a separate robots.txt file for each of these protocols. For example, to allow Googlebot to index all http pages but no https pages, you'd use the robots.txt files below.

For your http protocol (http://yourserver.com/robots.txt):
User-agent: *
Allow: /
For the https protocol (https://yourserver.com/robots.txt):
User-agent: *
Disallow: /


The search engine will continue to exclude your site or directories from successive crawls if the robots.txt file exists in the web server root. If you do not have access to the root level of your server, you may place a robots.txt file at the same level as the files you want to remove. Doing this and submitting via the automatic URL removal system will cause a temporary, 180-day removal of your site from the search engine's index, regardless of whether you remove the robots.txt file after your request has been processed. (Keeping the robots.txt file at the same level would require you to return to the URL removal system every 180 days to reissue the removal.)

Remove part of your website

Option 1: Robots.txt
To remove directories or individual pages of your website, you can place a robots.txt file at the root of your server. For information on how to create a robots.txt file, see The Robots Exclusion Standard.

When creating your robots.txt file, please keep the following in mind: when deciding which pages to crawl on a particular host, Googlebot will obey the first record in the robots.txt file with a User-agent starting with "Googlebot". If no such entry exists, it will obey the first entry with a User-agent of "*". Additionally, Google has introduced increased flexibility to the robots.txt standard through the use of asterisks: Disallow patterns may include "*" to match any sequence of characters, and patterns may end in "$" to indicate the end of a name.
To remove all pages under a particular directory (for example, lemurs), you'd use the following robots.txt entry:
User-agent: Googlebot
Disallow: /lemurs
To remove all files of a specific file type (for example, .gif), you'd use the following robots.txt entry:

User-agent: Googlebot
Disallow: /*.gif$
To remove dynamically generated pages, you'd use this robots.txt entry:

User-agent: Googlebot
Disallow: /*?
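
These rules do not have to live in separate files: multiple Disallow lines can sit under a single User-agent record, so one entry could block the lemurs directory, all .gif files, and all dynamically generated pages at once:

User-agent: Googlebot
Disallow: /lemurs
Disallow: /*.gif$
Disallow: /*?
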
Option 2: Meta tags

Another standard, which can be more convenient for page-by-page use, involves adding a <META> tag to an HTML page to tell robots not to index the page. This standard is described at http://www.robotstxt.org/wc/exclusion.html#meta.
To prevent all robots from indexing a page on your site, you'd place the following meta tag into the <HEAD> section of your page:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
To allow other robots to index the page on your site, preventing only Googlebot from indexing it, you'd use the following tag:
<META NAME="GOOGLEBOT" CONTENT="NOINDEX, NOFOLLOW">
To allow robots to index the page on your site but instruct them not to follow outgoing links, you'd use the following tag:
<META NAME="ROBOTS" CONTENT="NOFOLLOW">
Note: If you believe your request is urgent and cannot wait until the next time the search engine crawls your site, use the automatic URL removal system. In order for this automated process to work, the webmaster must first insert the appropriate meta tags into the page's HTML code. Doing this and submitting via the automatic URL removal system will cause a temporary, 180-day removal of these pages from the search engine's index, regardless of whether you remove the robots.txt file or meta tags after your request has been processed.

Remove an image from Google's Image Search

To remove an image from Google's image index, add a robots.txt file to the root of the server. (If you can't put it in the server root, you can put it at directory level.)
Example: If you want Google to exclude the dogs.jpg image that appears on your site at www.yoursite.com/images/dogs.jpg, create a file at www.yoursite.com/robots.txt and add the following text:
User-agent: Googlebot-Image
Disallow: /images/dogs.jpg
To remove all the images on your site from our index, place the following robots.txt file in your server root:

User-agent: Googlebot-Image
Disallow: /
This is the standard protocol that most web crawlers observe for excluding a web server or directory from an index. More information on robots.txt is available here:
http://www.robotstxt.org/wc/norobots.html.
Additionally, Google has introduced increased flexibility to the robots.txt file standard through the use of asterisks. Disallow patterns may include "*" to match any sequence of characters, and patterns may end in "$" to indicate the end of a name. To remove all files of a specific file type (for example, to include .jpg but not .gif images), you'd use the following robots.txt entry:
User-agent: Googlebot-Image
Disallow: /*.gif$



Sunday, 28 August 2011

Shortcut Code to Track Downloads and all other Non-open Pages



Using the JavaScript below, we can track the following file extensions:

doc, eps, jpg, png, svg, xls, ppt, pdf, zip, txt, vsd, vxd, js, css, rar, exe, wma, mov, avi, wmv, mp3

How do you install the tracking JavaScript?

Step 1: Upload the gatag.js file (shown below) to your website

(Add it to this folder: http://www.yoursite.com/scripts/ )

Step 2: Add the following JavaScript code just before your Google Analytics tracking code:

<script src="http://www.yoursite.com/scripts/gatag.js" type="text/javascript"></script>
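
For example, with the traditional (synchronous) Analytics snippet, the order on the page would look something like this; UA-XXXXX-X is just a placeholder for your own profile ID:

<script src="http://www.yoursite.com/scripts/gatag.js" type="text/javascript"></script>
<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
var pageTracker = _gat._getTracker("UA-XXXXX-X");
pageTracker._trackPageview();
</script>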

Place this line on every page you want to track. The gatag.js file itself will look something like this:
__________________________________________________________________________

if (document.getElementsByTagName) {
    // Initialize external link handlers
    var hrefs = document.getElementsByTagName("a");
    for (var l = 0; l < hrefs.length; l++) {
        // try {} catch {} block added by erikvold VKI
        try {
            // protocol, host, hostname, port, pathname, search, hash
            if (hrefs[l].protocol == "mailto:") {
                startListening(hrefs[l], "click", trackMailto);
            } else if (hrefs[l].hostname == location.host) {
                var path = hrefs[l].pathname + hrefs[l].search;
                var isDoc = path.match(/\.(?:doc|eps|jpg|png|svg|xls|ppt|pdf|zip|txt|vsd|vxd|js|css|rar|exe|wma|mov|avi|wmv|mp3)($|\&|\?)/);
                if (isDoc) {
                    startListening(hrefs[l], "click", trackExternalLinks);
                }
            } else if (!hrefs[l].href.match(/^javascript:/)) {
                startListening(hrefs[l], "click", trackExternalLinks);
            }
        }
        catch (e) {
            continue;
        }
    }
}

// Cross-browser helper to attach a click handler to a link
function startListening(obj, evnt, func) {
    if (obj.addEventListener) {
        obj.addEventListener(evnt, func, false);
    } else if (obj.attachEvent) {
        obj.attachEvent("on" + evnt, func);
    }
}

// Records mailto: clicks as virtual pageviews of the form /mailto/address
function trackMailto(evnt) {
    var href = (evnt.srcElement) ? evnt.srcElement.href : this.href;
    var mailto = "/mailto/" + href.substring(7);
    if (typeof(pageTracker) == "object") pageTracker._trackPageview(mailto);
}

// Records file downloads and outbound links as virtual pageviews
// (outbound links are prefixed with /external/hostname)
function trackExternalLinks(evnt) {
    var e = (evnt.srcElement) ? evnt.srcElement : this;
    while (e.tagName != "A") {
        e = e.parentNode;
    }
    var lnk = (e.pathname.charAt(0) == "/") ? e.pathname : "/" + e.pathname;
    if (e.search && e.pathname.indexOf(e.search) == -1) lnk += e.search;
    if (e.hostname != location.host) lnk = "/external/" + e.hostname + lnk;
    if (typeof(pageTracker) == "object") pageTracker._trackPageview(lnk);
}



__________________________________________________________________________________
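
Once this is in place, the clicks show up in your Analytics content reports as virtual pageviews: a tracked file download appears under its own path (for example /files/brochure.pdf), a mailto: click appears as /mailto/ followed by the address, and an outbound link appears as /external/ followed by the destination hostname and path.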

Implementing 301 redirects for an Apache server (non-www to www)


Implementing 301 redirects for an Apache server:

Step 1: To implement a 301 redirect, the file we need to work with is the .htaccess file. To access the file, connect to your site via FTP and look in the document root.

Step 2: If you can’t see it, enable viewing of hidden files, since the .htaccess file is hidden. If there is still no .htaccess file present, create one with a simple text editor.

Step 3: Insert this code in the file, replacing example.com with your own domain:

RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule .? http://www.example.com%{REQUEST_URI} [R=301,L]

Also make sure the rewrite engine is turned on; you will only need to turn it on once.

Step 4: Save and Test it!
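
A quick way to test from the command line is to request a non-www URL with curl and check that the response is a 301 with a Location header pointing at the www version (the domain and page below are placeholders):

curl -I http://example.com/some-page

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/some-page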

25 Appealing Call-to-actions for Facebook Pages

Some Beautiful FB Pages