How do you measure your social media and community activity and results? This article is a collaborative effort to capture key metrics by social media platform and community scenario (or strategy).

You can help. Join the TechNet Wiki member community and revise this topic to reflect your approach and success metrics, enhance existing guidance, or provide additional references (including pointers and descriptions to your relevant content)!


This section contains background information including typical goals of social media and community efforts and a description of common metrics.

Goals of Community and Social Media Efforts

There are many reasons why individuals and organizations engage with communities. Typical goals include:

  • Understand the Customer (organizational goal). Some organizations want to move from studying indicators of customer thoughts and behaviors (such as surveys) to direct observation (such as the TechNet Wiki or Twitter).
  • Knowledge transfer (individual goal). Some people enjoy sharing something they have learned in anticipation of learning something from someone else.
  • Sense of belonging (individual goal). Even on the job, individuals derive satisfaction from a sense of belonging to a community of peers.
  • Establishing or building reputation (individual, organizational goal). It takes time and care to move from unknown contributor to recognized participant and competent "good citizen" able to provide actionable information within the boundaries of community norms and values.
  • Changing perception of the organization (organizational goal). People don't want to hear from companies; they want to work with individuals.

Common Metrics

Many of the metrics used for web, social media, marketing, online publishing, and other online efforts have roots in established traditions. Newspapers and other print media were interested in metrics such as total views (broken down in a number of ways), unique views, popularity, value, brand, trust, and impact; these were used to determine advertising rates. Electronic bulletin board system operators focused on the number of callers, call duration, participation, and upload/download activity. These metrics may be "old school", but they form the foundation of today's social media and community measurement efforts.

The following metrics are common across multiple platforms and scenarios. Many derive meaning only when backed by business and team goals as part of a larger scenario. Not all metrics apply to every platform.  

  • Page View. From Wikipedia, a page view is "a request to load a single HTML file ('page') of an Internet site... This should be contrasted with a hit, which refers to a request for any file from a web server. There may therefore be many hits per 'page' view since an HTML 'page' can be made up of multiple files." Page views are the "currency of the realm", the metric that all websites track even if it is relatively meaningless as an indicator of community on its own.
  • Unique Visitor. This is usually seen as a metric measuring "reach". Websites with more reach are thought to command more page views, which drive advertising revenue. For some goals, "depth" or "engagement" are more important than "reach." For more information, see Unique Visitor (Wikipedia).
  • Bounce Rate. The percentage of visits in which the visitor lands on and then exits a single page on the site without viewing any others. If the site structure has multiple levels, it is important to understand whether users are making it to the intended level.
  • Search Rank. The position of a web page in the search results of major search engines implies quality based on the prestige of the search engine and its algorithms. Search engine optimization efforts can affect this result (even for questionable content); see SEO on Wikipedia for more information.
  • Referrer. Information about where the visitor arrived from -- a search engine, a blog, or "nowhere" (the address was typed in or selected via favorites or another mechanism). This metric is usually shown in terms of the referring domain and the referring page (e.g., a specific blog post).
  • Keywords. Referrals from search engines include keywords used to narrow down content. Knowing which terms are used most often can help gauge audience, content relevance, and other elements. Knowing search terms that are not matched with content can be important for improving content discovery.
  • Contribution trend. Increases in contributions over time indicate the softer measures of comfort and engagement.
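Several of the metrics above reduce to simple ratios over raw counts. As one illustration, here is a minimal sketch of computing bounce rate from per-visit page counts; all numbers are hypothetical:

```python
# Hypothetical per-visit page counts; a visit that views exactly
# one page before exiting counts as a bounce.
visit_page_counts = [1, 3, 1, 5, 2, 1, 1, 4]

bounces = sum(1 for pages in visit_page_counts if pages == 1)
bounce_rate = bounces / len(visit_page_counts) * 100

print(f"Bounce rate: {bounce_rate:.1f}%")  # Bounce rate: 50.0%
```

Real analytics packages derive this from server or client-side logs; the point is only that the metric is the ratio of single-page visits to total visits.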


Different people and organizations have different reasons to gather metrics. You may need to provide them to a potential advertiser or a potential investor in your site, or you might simply want to gather them and see how they change when you try different approaches on your site or social platform. While metrics are useful in these and other ways, they represent an inexact science at best and can be totally misleading at worst. You should be able to recognize whether your efforts are successful by looking at the totality of your metrics, non-quantitative observations, and direct input from members of your community. In most cases, you'll be able to tell whether what you're doing is working without getting bogged down in measurements.


Social Networking

Social media tools have basic functionality that tends to dictate a core set of metrics. For example, the metrics number of connections, number of influential connections, and recommendations make sense for social networking sites like LinkedIn that make it easy to establish and sustain professional connections. These metrics can be important on their own but are generally part of a larger multi-platform and community effort (or scenario; see Scenarios below).


Blogs

Blogs provide a publishing environment for a wide variety of content: articles, videos, files, code snippets, and other content types can all be hosted on a blog. Metrics to consider when measuring the performance of a blog include:

  • Posts overall or posts by topic or category.
  • Comments provided in response to a blog post.
  • Responses to comments posted to the blog per month.
  • Views or number of "hits" to the pages on the blog.
  • Unique visitors to the pages on the blog.
  • Search engine hits on Bing/Google for the blog author's name and primary technology (as a way to measure influence, reach and reputation).
  • RSS hits for blog content.
  • Comment Types. For example, the number of blog comment participants who were selected as MVPs in that technology, became customer advocates, volunteered to contribute articles to the blog, or volunteered to partner on content in the TechNet Wiki, forums, or other venues.
  • Average rating of articles on the blog by the community.
  • References to the blog post by thought leaders in the area that the blog covers.
  • Trends for each of the above metrics, including weekly, monthly, quarterly, and yearly.
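The trend metric above can be made concrete with a simple month-over-month calculation; the monthly view counts here are made up for illustration:

```python
# Hypothetical monthly page views for a blog.
monthly_views = [1200, 1500, 1350, 1800]

# Month-over-month percentage change for each consecutive pair of months.
mom_changes = [
    (curr - prev) / prev * 100
    for prev, curr in zip(monthly_views, monthly_views[1:])
]

print([f"{change:+.1f}%" for change in mom_changes])  # ['+25.0%', '-10.0%', '+33.3%']
```

The same pattern applies to comments, unique visitors, or any other raw count tracked per period.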

In addition to the raw numbers obtained for the above measurements, you might also want to consider "weighted values". The weighted values (which we might call "ratings") enable you to compare performance based on the popularity of the product or technology on which the blog is focused.

For example, suppose that Microsoft Exchange has 1000 customers (of course this is wrong; we're just making up a number for our example) and Unified Access Gateway (UAG) has 10 customers. We could then conjecture that Exchange is approximately 100x more popular than UAG, and therefore multiply our raw numbers (not derivative values like month-over-month percentage increases or decreases) by 100 to see how a blog hosting UAG content would perform if the product were as popular as Exchange. The weighting values would potentially be higher for topics that reach across a large number of technologies, such as scripting.
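The arithmetic in this example can be sketched directly; the customer counts are the made-up numbers from the paragraph above, and the raw view count is also hypothetical:

```python
# Made-up customer counts from the example above.
exchange_customers = 1000
uag_customers = 10

# Popularity weighting: Exchange is ~100x more popular than UAG.
weight = exchange_customers / uag_customers

# Hypothetical raw monthly page views for the UAG-focused blog.
uag_raw_views = 2500

# Weighted views: an estimate of how the UAG blog would perform
# if UAG were as popular as Exchange.
uag_weighted_views = uag_raw_views * weight

print(weight, uag_weighted_views)  # 100.0 250000.0
```

Remember that the weight applies only to raw counts, not to derivative values such as month-over-month percentage changes.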


Wikis

Wikis enable collaborative authoring of articles and longer chunks of information. For a quick review, visit Wiki, a comprehensive article about wikis, on Wikipedia.

Metrics for Wikis include:

  • Total number of articles in the Wiki. This can measure traction, contributor adoption, community health, an aspect of engagement, participation and platform success (when success is measured by number of articles).
  • New articles. Helpful when interested in net new content, community subject/content interest, participation, engagement, and general success (when measured by number of new articles per time period).
  • Deleted articles. Useful as a measure of undesirable activity (poor content) and desirable activity (healthy moderation). 
  • Word count hybrid. Computed as (article word count / days the article has been published) * article page views. This is a wiki-specific measure of "reach."
  • Mentions including tweets, bookmarks, prints, favorites, likes, and other types of social media references. This is meant to measure "virality" or "buzz", meaning that users recommend it to others. This is sometimes seen as endorsement.
  • Search rank. This is meant to measure how easy it is for the content to be discovered (its discoverability).
  • Non-text revisions including video, sample code, screenshots, diagrams, and images. This is meant to measure technical content quality, on the hypothesis that content including elements like diagrams and sample code is generally "more technical" than comparable content without them.
  • Contributor activity including number of new contributors and the ratio of contributors to profile owners (overall participation rate).
  • Member activity. Number of new profiles created, time to first contributions, proportion of activity in Wiki versus forums or other venues (if profile captures such information).
  • Revisions. Computed as (number of edits that result in a >20% change in word count) * (number of unique editors). This focuses on quality improvement as a function of edits and editor diversity. Although correcting a spelling error adds value, it would not count as part of this metric.
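The two formula-based metrics in the list, the word count hybrid and the revisions measure, can be sketched with hypothetical article data (every value below is made up):

```python
# Hypothetical data for a single wiki article.
word_count = 1200
days_published = 30
page_views = 4500

# Word count hybrid: (word count / days published) * page views.
word_count_hybrid = (word_count / days_published) * page_views

# Hypothetical edit history: (fractional word-count change, editor).
edits = [(0.25, "alice"), (0.05, "bob"), (0.40, "carol"), (0.30, "alice")]

# Only edits changing the word count by more than 20% are counted.
substantial = [(change, editor) for change, editor in edits if change > 0.20]
unique_editors = {editor for _, editor in substantial}

# Revisions score: substantial edits * unique editors.
revisions_score = len(substantial) * len(unique_editors)

print(word_count_hybrid, revisions_score)  # 180000.0 6
```

Spelling fixes and other minor edits fall below the 20% threshold and so, as the list item notes, do not contribute to the revisions score.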


Scenarios

This section explores metrics by scenarios of varying complexity. Please share your scenarios and metrics; this article will only improve if you contribute!