Jun 12, 2015

My general approach to extracting data from any API is to load the data into a relational database and then write SQL queries on top of it to get the required information. Though this works, it's often tedious and involves running multiple applications.

The R programming language works great for statistical computation and plotting graphics, and I have been tinkering with it for the last few weeks. While learning R, I thought of using it to extract data from the API as well. This would allow extracting the latest data from the API and computing stats with a single script. And though the XML package in R doesn't make for the most intuitive parsing code, the vectorized operations reduce the need for frequent loops and keep the code concise and readable.

And though this code is written for the Socialcast API, it can be easily tweaked to pull data from any social API like Facebook, Yammer, etc. The first step is to pull the data from the API; the RCurl package gets us the data, which can then be parsed using the XML package.


library(RCurl)
library(XML)

page <- 1
finalDataFrame <- NULL
continueLoading <- TRUE

# Extract the text of childNode under every match of parentNode,
# returning NA when the child element is missing
getInnerText <- function(inputData, parentNode, childNode) {
  xpathSApply(inputData, parentNode, function(x) {
    if (is.null(x[childNode][[childNode]])) {
      NA
    } else {
      xmlValue(x[childNode][[childNode]])
    }
  })
}

while(continueLoading) {

  messagesData <- getURL(paste("https://demo.socialcast.com/api/groups/acmecorpsoftballteam/messages.xml?per_page=20&page=",page,sep=""),
                         userpwd="emily@socialcast.com:demo", ssl.verifypeer = FALSE, httpauth = 1L)
  print(paste("LOADING PAGE:",page))
  data <- xmlParse(messagesData)
  totalMessages <- length(getNodeSet(data,"//messages/message"))

The totalMessages variable checks the number of messages returned by the API: when it's zero, the while loop exits; otherwise execution continues. The xmlParse function gives us an in-memory structure of the document which can be iterated over. We use the sapply function, which applies a function to each element of a list and returns a vector. I'll come to the getUserNodeValue function later.

  if (totalMessages == 0) {
    continueLoading <- FALSE
  } else {
    tempDataFrame <- data.frame(
      InteractionType = "Message",
      ID = sapply(getNodeSet(data, "//messages/message/id"),xmlValue),
      Author = sapply(getNodeSet(data,"//messages/message/user/name"),xmlValue),
      Body = sapply(getNodeSet(data,"//messages/message/body"),xmlValue),
      Url = sapply(getNodeSet(data,"//messages/message/permalink-url"),xmlValue),
      Type = sapply(getNodeSet(data,"//messages/message/message-type"),xmlValue),
      CreatedAt = sapply(getNodeSet(data,"//messages/message/created-at"),xmlValue),
      Location = sapply(getNodeSet(data,"//messages/message/user/id"),function(x){getUserNodeValue(x,"Location")}),
      Country = sapply(getNodeSet(data,"//messages/message/user/id"),function(x){getUserNodeValue(x,"Country")}),
      Sector = sapply(getNodeSet(data,"//messages/message/user/id"),function(x){getUserNodeValue(x,"Sector")}),
      Title = sapply(getNodeSet(data,"//messages/message/user/id"),function(x){getUserNodeValue(x,"Title")}),
      Department = sapply(getNodeSet(data,"//messages/message/user/id"),function(x){getUserNodeValue(x,"Department")})
    )

    if (is.null(finalDataFrame)) {
      finalDataFrame <- tempDataFrame
    } else {
      finalDataFrame <- rbind(finalDataFrame, tempDataFrame)
    }

Now we have a data frame with all the messages from the API. However, we also need the comments and likes. This is the only place where I needed a for loop, to iterate through each individual message node and select its comments. The xpathSApply function reduces our code further by querying each node of the NodeSet with the given XPath expression and applying a function to it. Furthermore, it returns a vector which fits nicely into our existing data frame.

    for (i in 1:totalMessages) {
      if (length(getNodeSet(data,paste("//messages/message[position()=",i,"]/comments/comment",sep=""))) > 0) {

        allComments <- getNodeSet(data,paste("//messages/message[position()=",i,"]/comments",sep=""))[[1]]


        commentFrame <- data.frame(
          InteractionType = "Comment",
          ID = xpathSApply(allComments,"comment/id",xmlValue),
          Author = xpathSApply(allComments,"comment/user/name",xmlValue),
          Body = xpathSApply(allComments,"comment/text",xmlValue),
          Url = xpathSApply(allComments,"comment/permalink-url",xmlValue),
          Type = "",
          CreatedAt = xpathSApply(allComments,"comment/created-at",xmlValue),
          Location = xpathSApply(allComments,"comment/user/id",function(x){getUserNodeValue(x,"Location")}),
          Country = xpathSApply(allComments,"comment/user/id",function(x){getUserNodeValue(x,"Country")}),
          Sector = xpathSApply(allComments,"comment/user/id",function(x){getUserNodeValue(x,"Sector")}),
          Title = xpathSApply(allComments,"comment/user/id",function(x){getUserNodeValue(x,"Title")}),
          Department = xpathSApply(allComments,"comment/user/id",function(x){getUserNodeValue(x,"Department")})
        )

        finalDataFrame <- rbind(finalDataFrame,commentFrame)
      }

      if (length(getNodeSet(data,paste("//messages/message[position()=",i,"]/likes/like",sep=""))) > 0) {

        allLikes <- getNodeSet(data,paste("//messages/message[position()=",i,"]/likes",sep=""))[[1]]

        likeFrame <- data.frame(
          InteractionType = "Like",
          ID = xpathSApply(allLikes,"like/id",xmlValue),
          Author = xpathSApply(allLikes,"like/user/name",xmlValue),
          Body = "",
          Url = "",
          Type = "",
          CreatedAt = xpathSApply(allLikes,"like/created-at",xmlValue),
          Location = xpathSApply(allLikes,"like/user/id",function(x){getUserNodeValue(x,"Location")}),
          Country = xpathSApply(allLikes,"like/user/id",function(x){getUserNodeValue(x,"Country")}),
          Sector = xpathSApply(allLikes,"like/user/id",function(x){getUserNodeValue(x,"Sector")}),
          Title = xpathSApply(allLikes,"like/user/id",function(x){getUserNodeValue(x,"Title")}),
          Department = xpathSApply(allLikes,"like/user/id",function(x){getUserNodeValue(x,"Department")})
        )

        finalDataFrame <- rbind(finalDataFrame,likeFrame)
      }
    }
  }
  page <- page + 1
}



Now we come to the getUserNodeValue function. This is simply a performance optimization, since calling the API for the user details each time becomes very time-consuming. So I generally keep the user data in a database, load it into a data frame (users below), and use the id in the XML response to fetch the correct user record. This step, however, is purely optional; you could just as easily call the API for each user's details and parse the response.

getUserNodeValue <- function(inputNode,queryNode){
  userRow <- users[users$ID == xmlValue(inputNode),]
  if (nrow(userRow) == 0) {
    NA
  } else {
    userRow[[queryNode]]
  }
}

At this point we have all the API information parsed into a data frame (finalDataFrame). Now for the fun part! Though you can subset and count easily using the built-in language functions, a package called dplyr makes this code more readable and intuitive. With dplyr you can perform multiple data manipulation operations like filter, select, arrange, group by, etc., and chain them together to get the final result.

So to group the data frame by a column and count, the code is as simple as:

#############Type of Activity in the Group#####################################

library(dplyr)

interactionType <- group_by(finalDataFrame,InteractionType) %>%
                   summarise(count = n())



#############Active Day of Week#############################

activeDay <- group_by(finalDataFrame,weekdays(as.Date(CreatedAt))) %>%
 summarise(count = n())


The top 5 users by total activity:

activeUsers <- group_by(finalDataFrame,Author) %>%
 summarize(TotalActivity=n()) %>%
 arrange(-TotalActivity) %>%
 head(5)


The types of messages being created:

messageTypes <- filter(finalDataFrame,InteractionType == "Message") %>%
 group_by(Type) %>%
 summarize(count = n())


The stats shown here barely scratch the surface of what R is capable of computing.
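The same group-and-count pattern carries over to other languages as well. As a rough illustration (the sample records below are made up, shaped like rows of finalDataFrame), here is the interaction-type tally sketched with Python's standard library:

```python
from collections import Counter

# Hypothetical records shaped like rows of finalDataFrame
interactions = [
    {"InteractionType": "Message", "Author": "emily"},
    {"InteractionType": "Comment", "Author": "frank"},
    {"InteractionType": "Like",    "Author": "emily"},
    {"InteractionType": "Message", "Author": "frank"},
]

# Equivalent of group_by(InteractionType) %>% summarise(count = n())
interaction_type = Counter(row["InteractionType"] for row in interactions)
print(interaction_type["Message"])  # 2
```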

Jul 07, 2012

I installed WordPress at work and have been trying to make a multisite installation work as an enterprise blogging platform. During one of the discussions around it, a colleague asked me if it was possible to integrate the blog comments system with Socialcast (the microblogging tool which is quite popular at work). I initially thought this would not be straightforward and would require development of a custom plugin from scratch.

Later, however, I did some searching and found that Socialcast already provides an infrastructure called Reach which can be used to integrate comments, posts and trends with a variety of third-party sites. For an organization, this integration is extremely valuable as it introduces a common store for all the social interactions, be it Sharepoint, blogs, intranet pages or anything else. Since Reach is written in JavaScript, it doesn't pose any restrictions on the server-side technology used for the sites.

So the primary goal was to make Reach work with WordPress. Initially I looked at options like HTML Javascript Adder, which lets you add the Reach code directly into a widget on the site. However, this posed too many issues given the lack of control one had over when the scripts were getting loaded and the difficulty of configuring it. Since all Reach scripts look exactly the same except for a token which is generated by Socialcast when the script is created in the admin panel, it is pointless to keep replicating the same code everywhere.

Then I came across a plugin written by Monica Wilkinson of Socialcast. However, this was last updated a year ago, and both WordPress and Socialcast required some changes. So I forked the branch and made a few minor tweaks to suit my requirement. The plugin gives an options page to configure the tokens and the URL of your Socialcast community. So I added the PHP file to my plugins directory and Network Activated it (this was a multisite installation). Once this is done you get an options page on the dashboard.

Now the options page has the tokens that need to be entered along with the URL of your Socialcast community. Remember to be careful while sharing the tokens, as they allow access to the community without proper login credentials. I am not sure if Socialcast provides the option of revoking these tokens periodically and issuing fresh ones, but such a feature should exist to protect the company data.

There are four main kinds of reach extensions

  • Discussion – A comments system which would be shared with the Socialcast community
  • Stream – Any group or company stream
  • Trends – Trending topics, people etc.
  • Button – Like, Recommend, Share buttons. The exact verb can be configured on the admin screen.

All of these are rendered in the exact same way: by calling a script asynchronously (services/reach/extension.js) and then pushing the reach object with the JavaScript token. In the plugin there is a get_div function which generates the HTML tag.

function get_div($id, $style, $token) {
	$socialcast_url = get_option('sc_host');
	if ($id != '' && $token != '') {
		return '<div id="' . $id . '" style="' . $style .
		'"></div><script type="text/javascript">_reach.push({container: "' . $id . '", domain: "https://'
		. $socialcast_url . '", token: "' . $token . '"});</script>';
	} else {
		return '';
	}
}
There are two main ways of rendering the appropriate Reach control on your page

  • call the function in PHP code
  • Use the shortcode [reach]

Let's see the first option. The appropriate function that needs to be called in PHP code is given on the options screen. Let's say I want to display the button just below the title. So I go to the theme's postheader.php file and call the sc_add_button function in PHP code. Note: the call to sc_add_button will only work if you have the button token configured in the plugin options. This step may differ from theme to theme.

<header class='post-header title-container fix'>
	<div class="title">
		<<?php echo $header_tag;?> class="posttitle"><?php echo suffusion_get_post_title_and_link(); ?></<?php echo $header_tag;?>>
		<?php echo sc_add_button('width:300px;height:30px'); ?>
	</div>
	<?php if ($post_meta_position == 'corners') { ?>
	<div class="postdata fix">

Or if you want the comments system to be replaced by the Socialcast discussion system, go to the comment-template.php file in the wp-includes directory and replace the comment markup with a call to sc_add_discussion(). Remember that you can pass a style to this method so it overrides the default styles in the plugin.

		<?php if ( comments_open( $post_id ) ) : ?>
			<?php do_action( 'comment_form_before' ); ?>
			<?php echo sc_add_discussion(''); ?>
			<?php do_action( 'comment_form_after' ); ?>
		<?php else : ?>
			<?php do_action( 'comment_form_comments_closed' ); ?>
		<?php endif; ?>

The resulting page looks like this

Now for the shortcode way. This plugin initially required the user to enter the token in the shortcode functions, but I wasn't too happy with that approach, as revealing the tokens to non-admin users seems risky. So I wrote a new function in the plugin which allows the user to specify what should be displayed, with the token read from the options. The previous way of specifying a token still exists as well.

function get_shortcode_div($id, $style, $token, $display) {
	$tokenInOptions = '';
	if ($display != '') {
		switch ($display) {
			case 'button':
			case 'discussion':
				$tokenInOptions = get_option('sc_discussion_token');
				break;
			case 'profile':
				$tokenInOptions = get_option('sc_profile_token');
				break;
		}
	}
	if ($tokenInOptions != '') {
		return get_div($id, $style, $tokenInOptions);
	}
	return get_div($id, $style, $token);
}

This system makes it really easy for users to add the button to their blogs. Just insert a text-based widget in the sidebar with the [reach] shortcode and it will render the widget when the page is run. Just make sure the theme calls the do_shortcode function on the widget_text filter. If not, a single-line addition, add_filter('widget_text', 'do_shortcode'), should do it.

Once the widget is saved, the reach extension is rendered on the page. The modified plugin can be downloaded here and the original version written by Monica can be downloaded here. I will make changes for the trends extension soon once I get the token to test it out.

Dec 27, 2010

Since the last post, I changed the library/API wrapper a bit. I removed all the ugly reflection stuff used to retrieve the specific API URLs and substituted static variables in a separate class. This does have the disadvantage that the URLs are exposed to the client, but at least it won't break any client code if Socialcast decides to change the API in the future. Also, in the previous example the username, password and subdomain were variables in the wrapper itself. In the absence of OAuth, every call needs to be authenticated with the user credentials. To avoid having to handle the responsibility of storing user information, I created a class to encapsulate this information (SocialCastAuthDetails) which is passed to the API accessor for every call. I also added data objects to return strongly typed responses from the API accessor instead of an XmlDocument, but haven't gotten around to incorporating them yet.

Here is the code to post a message and get the company stream. Accessing the company stream requires two calls: first to get the stream ID, and then to get the messages for that particular stream.

        public XmlDocument GetCompanyStream(SocialCastAuthDetails auth)
        {
            XmlDocument streams = new XmlDocument();
            if (companyStreamID == 0)
            {
                // First call: list all streams and find the ID of the company stream
                streams.LoadXml(base.MakeServiceCalls(helper.GetSocialcastURL(ObjectType.Streams, auth.DomainName, null), GetCredentials(auth.Username, auth.Password)));

                foreach (XmlNode node in streams.GetElementsByTagName("stream"))
                {
                    if (node.SelectSingleNode("name").InnerText.ToLower() == "company stream")
                        companyStreamID = int.Parse(node.SelectSingleNode("id").InnerText);
                }
            }
            // Second call: load the messages for the stream ID found above
            // (the stream-messages URL comes from the helper)
            streams = new XmlDocument();
            streams.LoadXml(base.MakeServiceCalls(helper.GetSocialcastURL(ObjectType.StreamMessages, auth.DomainName, companyStreamID.ToString()), GetCredentials(auth.Username, auth.Password)));
            return streams;
        }

        public XmlDocument PostMessage(string title, string body, SocialCastAuthDetails auth)
        {
            string data = String.Format("message[title]={0}&message[body]={1}", HttpUtility.UrlEncode(title), HttpUtility.UrlEncode(body));
            XmlDocument update = new XmlDocument();
            update.LoadXml(base.MakeServiceCallsPOST(
                                helper.GetSocialcastURL(ObjectType.Messages, auth.DomainName, null),
                                GetCredentials(auth.Username, auth.Password), data));
            return update;
        }

Since any call which manipulates data requires a POST instead of a GET, the WebServiceHelper class needs a new method to make the service call using POST. The data to be posted is URL-encoded before being sent to this method.

        protected string MakeServiceCallsPOST(string _requestURL, NetworkCredential credentials, string data)
        {
            // Create the web request
            HttpWebRequest request = WebRequest.Create(_requestURL) as HttpWebRequest;

            request.Credentials = credentials;
            request.ContentType = "application/x-www-form-urlencoded";
            request.Method = "POST";

            byte[] bytes = Encoding.UTF8.GetBytes(data);

            request.ContentLength = bytes.Length;
            using (Stream requestStream = request.GetRequestStream())
            {
                requestStream.Write(bytes, 0, bytes.Length);
            }

            using (WebResponse response = request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                return reader.ReadToEnd();
            }
        }
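For comparison, the same mechanics (a form-encoded POST body with basic authentication) can be sketched with Python's standard library. This is only an illustration; the URL and credentials are the demo values used throughout the post, and the request is constructed but not sent:

```python
import base64
from urllib.parse import urlencode
from urllib.request import Request

# Form-encode the message fields, mirroring HttpUtility.UrlEncode above
data = urlencode({"message[title]": "Posting from API",
                  "message[body]": "this is a test message posted through Python"})

request = Request(
    "https://demo.socialcast.com/api/messages.xml",
    data=data.encode("utf-8"),  # supplying a body makes this a POST
    headers={
        "Content-Type": "application/x-www-form-urlencoded",
        # Basic auth header built from the demo credentials
        "Authorization": "Basic " + base64.b64encode(b"emily@socialcast.com:demo").decode("ascii"),
    },
)
print(request.get_method())  # POST
```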

This is the client code to post the message. The SocialCastAuthDetails class is initialized by the client and passed in, so it's their headache to maintain passwords and other sensitive information.

    class Program
    {
        static SocialCastAuthDetails auth = new SocialCastAuthDetails()
        {
            DomainName = "demo",
            Username = "emily@socialcast.com",
            Password = "demo"
        };

        static void Main(string[] args)
        {
            int _messageCounter = 1;
            APIAccessor api = new APIAccessor();
            api.PostMessage("Posting from API", "this is a test message posted through C#", auth);
            var xdoc = api.GetCompanyStream(auth);
            Console.WriteLine("Company Stream of demo.socialcast.com");
            foreach (XmlNode node in xdoc.GetElementsByTagName("message"))
            {
                Console.WriteLine("Message {0} posted by {1}", _messageCounter++, node.SelectSingleNode("user/name").InnerText);
                Console.WriteLine("Message: {0} {1}", node.SelectSingleNode("title").InnerText, node.SelectSingleNode("body").InnerText);
            }
        }
    }

It works!!

Dec 25, 2010

Socialcast is one of the better enterprise microblogging tools out there. I have been trying to use its API to better understand how people use microblogging in the enterprise. There is no better way to validate (or invalidate) one's hypotheses than by actually mining data and identifying patterns in it. If sufficient data exists, spread across a long enough time period, fascinating patterns emerge. In this older post, I had correlated the length of each blog post (on my previous organization's internal blogging platform) with the number of comments it received. After a senior colleague helped me make sense of the data, a clear conclusion was that the shorter a blog post is, the more likely people are to comment on it.

To attempt something similar with Socialcast, I finally got around to using their API through C# after procrastinating for a very long time. I didn't map the API responses/arguments to .NET objects yet, just wrote a few classes to make it easier to make different service calls without repeating code. In the code below, I used the API to return the details of some of the users. Similarly, there are different calls to get groups, streams, followers, etc. Any information retrieved from the site needs a GET call; anything where information is changed (commenting, posting a message, etc.) has to be done via POST.

Every Socialcast site has a different subdomain (e.g. demo.socialcast.com) and a username/password to authorize requests. Since OAuth is not yet available, this information needs to be stored in the application itself. I saved it in a class for now, but a better way would be to store it in a config file (and encrypt it for good measure). The SocialCastData class has all the client-specific details like API URLs, username, password, etc. All these properties are protected, and only the helper class which inherits from the data class can access them. The helper class provides the API URL and credentials to the APIAccessor class.

    class SocialCastData
    {
        /// <summary>
        /// These are the private variables which are configured
        /// as per your socialcast site
        /// </summary>
        string domainName = "demo";
        string skeletonURL = "https://{0}.socialcast.com";
        string userName = "emily@socialcast.com";
        string passWord = "demo";
        string _usersURL = "/api/users.xml";

        //Protected properties to give access to the username/password and API
        //URL only to the helper class which inherits from this data class
        protected string UserName { get { return userName; } }
        protected string Password { get { return passWord; } }
        protected string usersURL { get { return _usersURL; } }

        //Get the basic URL of the site, without any API call
        protected string GetSocialCastURL()
        {
            return String.Format(skeletonURL, domainName);
        }

        //This method uses reflection to provide the API URL
        //value based on an argument to this method
        protected string GetSocialCastURL(string apiFilter)
        {
            try
            {
                PropertyInfo _allProperties = this.GetType().GetProperty(apiFilter + "URL", BindingFlags.NonPublic | BindingFlags.Instance);
                if (_allProperties == null)
                    throw new Exception("There was no corresponding API URL found");
                string value = _allProperties.GetValue(this, null).ToString();
                return GetSocialCastURL() + value;
            }
            catch (Exception _eObj) { throw new Exception("There was no corresponding API URL found", _eObj); }
        }
    }
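The same look-up-by-name trick exists in most languages with reflection. As a rough sketch, a hypothetical Python equivalent of GetSocialCastURL(apiFilter) could use getattr (class and attribute names here are illustrative, with the demo values from the post):

```python
class SocialcastData:
    """Holds the per-site configuration (demo values from the post)."""
    skeleton_url = "https://{0}.socialcast.com"
    domain_name = "demo"
    users_url = "/api/users.xml"

    def get_url(self, api_filter):
        # Reflectively look up '<api_filter>_url', like GetProperty(apiFilter + "URL")
        suffix = getattr(self, api_filter + "_url", None)
        if suffix is None:
            raise AttributeError("There was no corresponding API URL found")
        return self.skeleton_url.format(self.domain_name) + suffix

print(SocialcastData().get_url("users"))  # https://demo.socialcast.com/api/users.xml
```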


    /// <summary>
    /// This is the helper class which provides the URL
    /// and Credentials to the WebServiceHelper object. Only the Helper
    /// class has access to the SocialCastData members since all its
    /// members are protected.
    /// </summary>
    class SocialcastHelper : SocialCastData
    {
        //use the base class data to get the Credentials object.
        public NetworkCredential GetCredentials()
        {
            return new NetworkCredential(base.UserName, base.Password);
        }

        //Get the URL for the socialcast website. The overloaded methods are for
        //returning the appropriate URL depending on the function and if any additional
        //query parameters are to be appended to the URL.
        public string GetSocialcastURL()
        {
            return base.GetSocialCastURL();
        }

        public string GetSocialcastURL(string _apiURL)
        {
            return base.GetSocialCastURL(_apiURL);
        }

        public string GetSocialcastURL(string _apiURL, List<KeyValuePair<string, string>> _paramMessages)
        {
            //Get the URL from the base class method and append
            //the query params
            string _url = base.GetSocialCastURL(_apiURL);
            if (_paramMessages.Count > 0)
            {
                _url += "?";
                foreach (var item in _paramMessages)
                {
                    //appending each param key and value
                    _url += String.Format("{0}={1}&", item.Key, item.Value);
                }
                //Strip the last ampersand sign
                return _url.Substring(0, _url.Length - 1);
            }
            return _url;
        }
    }
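As an aside, most standard libraries already provide this query-string assembly. In Python, for instance, urllib.parse.urlencode does exactly what the loop above does (the base URL below is just the users endpoint from this post):

```python
from urllib.parse import urlencode

base_url = "https://demo.socialcast.com/api/users.xml"
params = [("per_page", "30"), ("page", "1")]

# urlencode joins key=value pairs with '&', replacing the manual loop
url = base_url + "?" + urlencode(params)
print(url)  # https://demo.socialcast.com/api/users.xml?per_page=30&page=1
```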

The APIAccessor class contains the business-logic functions like GetUsers or GetMessages. It sends its parameters as a list of key-value pairs to the helper class, which constructs the actual REST call URL from them.

    class WebServiceHelper
    {
        public string MakeServiceCalls(string _requestURL, NetworkCredential credential)
        {
            // Create the web request
            HttpWebRequest request = WebRequest.Create(_requestURL) as HttpWebRequest;
            request.Credentials = credential;
            // Get response
            using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
            {
                // Get the response stream
                StreamReader reader = new StreamReader(response.GetResponseStream());

                // Console application output
                return reader.ReadToEnd();
            }
        }
    }

    class APIAccessor : WebServiceHelper
    {
        //Creating the Helper class Instance
        SocialcastHelper helper = new SocialcastHelper();

        public XmlDocument GetUsers(string numberOfusers, string page)
        {
            XmlDocument Users = new XmlDocument();
            var serviceParams = new List<KeyValuePair<string, string>>();
            serviceParams.Add(new KeyValuePair<string, string>("per_page", numberOfusers));
            serviceParams.Add(new KeyValuePair<string, string>("page", page));

            Users.LoadXml(base.MakeServiceCalls(helper.GetSocialcastURL("users", serviceParams), helper.GetCredentials()));
            return Users;
        }
    }

The test code below displays the names of users from the Socialcast demo site.

    class Program
    {
        static void Main(string[] args)
        {
            var _xDoc = new APIAccessor().GetUsers("30", "1");
            foreach (XmlNode item in _xDoc.GetElementsByTagName("user"))
            {
                XmlNode name = item.SelectSingleNode("name");
                Console.WriteLine(name.InnerText);
            }
        }
    }

The Sample code can be downloaded here.

Dec 17, 2010

In the early years of the internet, when it was still in its infancy and yet to capture the imagination of a significant section of society, it was mostly used as a one-way content distribution system. Websites were static and their content created by a small team, so a larger group of users could access it but not modify it. Over time, however, the Internet evolved and slowly democratized itself. Content creation passed more and more to users, and buzzwords like social media, blogging and tweeting were born. Web 2.0 had finally arrived, and how!!

Wikipedia, Facebook, Twitter, LinkedIn and WordPress were some of the great new tools which empowered users to eliminate geographical boundaries and connect in ways hitherto unimaginable. Blogging really took off in the first half of this decade. Suddenly, the common man had a voice: he could express dissent, criticism or his views of the world today. What was previously restricted to a lazy chat between friends could be put forth to the whole world, and this essentially changed media as we know it. But blogging hit a roadblock a few years back; even though the number of blogs was large, they still weren't quite the hit that social media exponents were hoping for. One reason could have been the significant amount of time which needed to be invested to write a decent blog post. One needed to jot down his thoughts, organize them, articulate them nicely and edit well enough to make the post crisp and enjoyable to the reader. The effort needed to do all this put regular blogging out of reach for many web users. Change, however, was just around the corner.

On a fateful day in 2006, Twitter was launched. Initially called the SMS of the Internet, Twitter's 140-character limit became an empowering tool rather than an impediment to content creation. Microblogging was far easier than blogging, and the medium exploded. From 400,000 tweets per quarter in 2007, it reached 50 million tweets per day in February this year. Given Twitter's unprecedented success, it was inevitable that the enterprise would look at it as a collaboration tool for the intranet. And thus, enterprise microblogging was born. I had the opportunity of being a part of two such social networks in my current and previous organizations (powered by Yammer and Socialcast) from their infancy to relative maturity. I'm no social media expert, but I jotted down some of my random observations from a user's perspective.

Growth Cycle

Like many software products and technologies, a user’s perception of Enterprise Microblogging also follows the hype cycle.

Most new users sign up to microblogging via an invite from their colleagues. Those who don't have much experience using Twitter are usually bewildered by this new tool. Expectations are high thanks to that over-enthusiastic colleague who promised the world for signing up. But disillusionment quickly sets in; for many users there is too much wildly varying content to make any sense of. They send out a couple of messages, maybe like a few posts, but that's just about it. Not many users are patient enough to ride out the disillusionment phase and actually start deriving value from the system. User retention is quite low in enterprise networks, and by my rough estimate only about 30-40% of users are active in the system.

The Champion

Though social networks are touted to grow horizontally without the need for any central direction, I have observed the opposite in enterprise microblogging. People adapt to the network much faster if there is someone who quickly welcomes new users, understands their interests, introduces them to like-minded people, and routes queries to the individuals or groups who can answer them fastest. Maybe in a mature network this function could be taken over by a group of power users, but networks that are still growing require an individual who can be the champion. In my previous organization I watched the quality of the network content go down considerably when the champion scaled down his participation and no one else took over.

Signal to Noise ratio

Image courtesy http://www.socialresearchmethods.net

The public timeline / home stream is the holy grail of user experience in any social network. Facebook took this seriously enough to patent its News Feed feature, which decides which friends' updates appear on your news feed. If our close friends' updates appear on the news feed, the chance of responding becomes much higher. In the absence of a personal connection on microblogging sites, this translates into users whose updates would be interesting to us. It could be determined by similarity in content, common connections, etc. Nothing would confuse a new user more than completely random content which is not in the least bit appealing.

Groups, Communities and Networks

Continuing with my previous point about the signal-to-noise ratio, the content in the home stream is often too amorphous to be of interest to a newbie. Here is where groups or communities help in the microblogging network. When like-minded people come together, the content they create is more likely to be homogeneous and easier for a person to relate to. In Yammer I saw that some groups had more posts and replies than the general stream itself. Socialcast also has a great feature which allows users to tag a post to a group in the comments even if the original author forgets to do so. This is where power users and the champion can contribute to the network by making sure all content is properly categorized into the groups where it belongs.

Will Microblogging kill email?

Microblogging has as much chance of killing email as Google Docs has of replacing MS Office. Regardless of the merits or demerits of the two platforms, the adoption of one at the cost of the other requires a fundamental shift in the thinking of users, not too dissimilar from the time when typewriters gave way to computers. Whether users will be willing to make that quantum leap and ditch email entirely, only time will tell. It is undeniable, however, that microblogging will eventually reduce the amount of email being sent in the organization. Organization-wide updates which are cascaded to a large group of users are far easier through social media than conventional email.

Knowledge Management Tool?

Knowledge management in today's organizations is not restricted to churning out thousands of documents and then taking pride in the number of documents in the knowledge repository. To be of any real value, the more useful resources must be separated from the ordinary copy-paste jobs from the internet. Microblogging gives us a measurable mechanism to gauge the community's response to any particular piece of content in the form of the replies and likes it receives. Documents which elicit a larger response are more likely to be of use. Also, since it's all text-based, all of the content is indexable and searchable from various other points of access like intranet portals, Sharepoint sites, etc.

Democratizing the Organization

Even though almost all the higher management of any company works with an open-door policy, in practice direct communication between employees engaged in actual operations and the management team is quite rare, and mostly confined to scenarios with a set agenda and little room for free discussion. Microblogging gives every employee the chance to put direct questions to the management team. Frank opinions and suggestions can be offered on the company's strategy.

Tagging and Trending

This one's a no-brainer. Hashtags allow us to categorize information by topic and make it easily searchable later. Trends give us a picture of the so-called hot topics being discussed at the very moment. However, though trends are suited to a large user base like Twitter's, enterprise microblogging networks have far fewer users, so this feature isn't used as much.