Wednesday, October 12, 2016

Posters, presentations and speed geeking: finding out what we know

The seven openings to events that we described in our last blog are a first step in 'bringing people’s voices and their different experiences into the room, in a spirit of curiosity and learning'. We used that phrasing when we first blogged about our Facilitation Practice last year to describe what happens as you move from openings to a logical next phase in gatherings: 'finding out what we know'. The Tagging and Human Spectrogram exercises we described in our last blog get people curious and interested, and lead naturally into richer conversations in which people find out about each other.

What shall we do about Presentations?

Presentations have a bad press among many development people exhausted by the round of conferences and workshops, and generally among facilitators too. The issue here is one of framing and organisation: "what if we re-imagine conferences and meetings as gatherings where people can connect, learn and have the conversations that really matter?" was a blog response here to Duncan Green's rant about awful events. We described a simple but effective approach to sharing, a variant on speed-networking: three-minute snapshot presentations in which people shared what was inspiring about their work, meaning that in less than an hour everyone knew the best of what was happening across a range of projects.

Presentations become engaging and energising when people are limited to a fixed time or number of slides, or by a timer-driven format like Pecha Kucha. This also offers a compromise for those who value the security or ease of PowerPoint. When there is a lot of detail to present, this approach allows for different modes of communication and learning. For example, at a recent annual meeting of the CARIAA program, which involves four large, complex research consortia doing detailed, current climate change research, each consortium gave a 10-minute introductory presentation very early in the three-day event. A bit like a TED talk, it meant that each of the senior scientists and their teams produced rich, engaging and dynamic communication that set the scene and sparked off a range of questions and follow-up conversations.

Posters and Galleries

In both those events the presentations were followed by a ‘market place’, with posters and other information for more in-depth discussions. The Building Demand for Sanitation (BDS) program, on the other hand, started with the posters.

Prof. Bilqis Hoque talks about women leaders in local Government at a BDS convening
The way the poster gallery is organised and managed has evolved over the five years of the program, as explained in our original post on the FacilitationAnywhere blog, where we also discuss other methods such as Speed Geeking and Carousels.

Wednesday, October 05, 2016

Seven Openings to facilitated events

Almost everyone’s arrived. Some are already sitting down, others are standing around and chatting. A couple of people are late, but time-keeping matters. It’s time to get started.

Suddenly you find yourself in front of a group of people – 15, or 33, or 65, or 128 of them, or more, most of whom don’t know you. You’re the facilitator and the people in the room are putting their trust in you to help them achieve something concrete by the end of the event. You want to seize the moment so that the participants come into the physical and mental space for the gathering as quickly and smoothly as possible. Then you can make a start on doing what needs to be done, letting the locus of control move between you and them.

Openings are about coming fully into the present and connecting with self, others and the purpose for the gathering. They enable people to ‘arrive’ in body and mind, relax into what’s happening, ready to engage with the work to be done. People need to be able to meet each other as quickly and easily as possible, to form as a group and create the ground for collaboration. Each group is new, formed in that time and place, meeting for a specific reason, and shaping its own particular identity.

In our recent blog on the FacilitationAnywhere site we describe seven ways to open, engage and connect...

Tuesday, August 30, 2016

How are development organizations using Google Analytics?

In the previous post, I described the importance of having a digital analytics measurement plan and presented some essential elements for correct and efficient use of Google Analytics (GA). However, recent work I've conducted makes me wonder how advanced (or not) the use of digital analytics - and GA specifically - is amongst development organizations. My recent experience was limited to six organizations (different in size, resources and capacities), so the sample is clearly small. But some trends are probably more common than not.


Google Analytics is not always used to its full potential 

In reviewing how different websites use GA, I discovered huge differences. For some organizations, the setup of Analytics is far from optimal. For example, one organization didn't have a clear understanding of the differences between a GA account, profile and property, which resulted in an unstructured proliferation of accounts.

The use of different reporting views, as well as filters and advanced segments, is also not very common. This means that Analytics data are just analysed in aggregate, without telling you much about the specific audience you intend to reach. For example, if your website is targeting users in North Africa and the Middle East, you need to be able to single out traffic from these regions, to better analyze your target audience.
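To make the idea concrete, here is a minimal offline sketch of the kind of segmentation a GA view filter performs, applied to exported report rows. The row structure and country list are illustrative assumptions, not GA's own format:

```python
# Hypothetical exported traffic rows; a real GA export would have more columns.
MENA_COUNTRIES = {"Egypt", "Morocco", "Tunisia", "Jordan", "Lebanon"}

def segment_by_region(rows, countries):
    """Split exported traffic rows into target-region traffic and the rest."""
    target = [r for r in rows if r["country"] in countries]
    other = [r for r in rows if r["country"] not in countries]
    return target, other

rows = [
    {"country": "Egypt", "sessions": 120},
    {"country": "France", "sessions": 80},
    {"country": "Jordan", "sessions": 45},
]
mena, rest = segment_by_region(rows, MENA_COUNTRIES)
print(sum(r["sessions"] for r in mena))  # sessions from the target region
```

In GA itself you would set this up once as a view filter or advanced segment, so every report is automatically scoped to the audience you care about.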

Tracking goals and 'conversions' is not always common practice. Goals can be set up in various ways in GA to track users' interaction with the website - when they scroll on the page, click a link, decide to print a page, comment, or spend a certain amount of time on the page. This can provide a great deal of information to website managers and editors, helping them improve the way information is presented and webpages are organized, as well as increase users' engagement.
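Interactions like these are typically recorded as event hits, which goals can then be based on. As a sketch, this builds (but does not send) an event payload using the Universal Analytics Measurement Protocol parameters; the tracking ID and values are placeholders:

```python
from urllib.parse import urlencode

def build_event_hit(tracking_id, client_id, category, action, label=None):
    """Build a Universal Analytics Measurement Protocol event payload.

    Payload only; sending it would be an HTTP POST to GA's collect endpoint.
    """
    params = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # property ID, e.g. "UA-XXXXX-Y" (placeholder)
        "cid": client_id,    # anonymous client ID
        "t": "event",        # hit type
        "ec": category,      # event category
        "ea": action,        # event action, e.g. "scroll" or "print"
    }
    if label:
        params["el"] = label  # optional event label
    return urlencode(params)

payload = build_event_hit("UA-XXXXX-Y", "555", "engagement", "scroll", "homepage")
print(payload)
```

On a live site the same event would normally be fired from the page's tracking snippet; a goal configured on the "engagement / scroll" event then counts as a conversion.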

Only one of the six organizations stood out in using an advanced configuration of Google Analytics to create different reporting views, filter data, and track goals and conversions.

Google Analytics too often stands alone 

I have highlighted before the importance of a digital analytics and measurement plan - and how Google Analytics may just be one part (even if the most important one) of your data collection and analysis system. However, I didn't find much of this in the websites I've recently reviewed.

On the one hand, while Analytics is the default tool to track digital analytics, in most cases it is also the only monitoring tool. On the other hand, when digital analytics are collected from different sources (e.g. website, newsletter, RSS feeds, social media), more often than not they are not presented and analysed in aggregate. Finally, not all organizations regularly produce actionable reports on the basis of their analytics to inform future actions and improvements on the website.

Only one organization presented a more advanced understanding of its digital analytics process, with multiple data collection points (e.g. website, newsletter, RSS feeds, social media) that fed into a dashboard spreadsheet, using formulas and calculations to avoid double counting and over-reporting of metrics. Even though there was no document describing a strategy, this is already a great step towards more efficient use of digital analytics.
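The double-counting guard in that dashboard can be illustrated with a small sketch: when the same item is reported by several collection points, count each (source, item) pair only once before totalling. The record structure here is an assumption for illustration:

```python
def total_downloads(records):
    """Sum downloads per item, keeping only one record per (source, item) pair."""
    seen = set()
    totals = {}
    for rec in records:
        key = (rec["source"], rec["item"])
        if key in seen:
            continue  # duplicate export row: skip to avoid over-reporting
        seen.add(key)
        totals[rec["item"]] = totals.get(rec["item"], 0) + rec["downloads"]
    return totals

records = [
    {"source": "website", "item": "brief-01", "downloads": 40},
    {"source": "newsletter", "item": "brief-01", "downloads": 25},
    {"source": "website", "item": "brief-01", "downloads": 40},  # duplicate row
]
totals = total_downloads(records)
print(totals)
```

A spreadsheet achieves the same effect with deduplicating formulas; the point is that aggregation across sources needs an explicit rule, not a blind sum.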

What can be done? 

I think a lot could be achieved through the availability of more specific content for international development and the open exchange of experiences around digital analytics for the development sector.

The majority of information and guidance available online, while comprehensive, tends to focus on e-commerce and more business-oriented websites. Other sources, such as the Digital Analytics Program (DAP), provide a good example of guidance, best practices, training and support in digital analytics. However, the target audience is also very specific, DAP being designed for US government agencies that provide information to the public. Ultimately, there is not much available that focuses specifically on digital analytics for development - and for information and knowledge services in particular.

Secondly, I think website administrators and managers should be more open about how they do digital analytics, as ODI has been doing by sharing their M&E dashboard. Knowledge sharing and learning opportunities should be created for users to exchange notes and learn from each other, and to identify good practices and examples that can be replicated. Ideally, I believe web managers should also be open about their actual website stats. Especially for publicly-funded websites, this would mean more transparency and the possibility to compare and benchmark different websites.

Finally, I think donors should play their part in fostering better use of digital analytics in the projects and programs they fund. Besides acting as convenors for peer learning initiatives around good use of digital analytics, donors should provide stronger guidance and support in this area, to standardize data tracking and collection across different projects. Ideally, for donor-funded websites and knowledge services, there should be more than a mention of a few, poorly selected web metrics in the project logframe: a digital analytics and measurement plan should be developed as part of the project inception phase.

In the next post in this series, I’ll look specifically at what metrics and indicators could be most useful, amongst the dozens available, for development websites and knowledge services.

Monday, August 22, 2016

Understanding digital analytics

In the past few months, two different projects gave me the opportunity to spend quite some time working on Google Analytics. The projects were different in scope: in the first, I was asked to review the use of Google Analytics as part of the monitoring and learning system of a think tank; the second was part of a larger evaluation of development information services, to understand the reach and use of different websites. Overall, I was able to review how six different development organizations are currently making use of Google Analytics to track web traffic and users' interaction.


This is the first post in a three-post series where I'll share some of the learning from this recent work. I'll start with some key points for efficient use of Google Analytics (GA). In the second post I'll present examples of how different organizations are using GA. Finally, in the third post I'll zoom in on Google Analytics metrics - and on measuring what matters to you.

Google Analytics - from basic to advanced use 

GA has developed a rich and sophisticated toolset over the years. It is now one of the most commonly used tools to monitor website traffic and engagement - probably the industry standard for web analytics across different domains, such as e-commerce, government, education, and development too. While some have questioned the accuracy of Google Analytics and there's no shortage of alternative tools out there, in my opinion Google Analytics remains one of the most powerful options for everyday use, especially in smaller organisations. One of the aspects I like most about Analytics (besides the fact that it's free...) is its continuous improvement - in terms of its own features and functionality as well as integration with third-party tools (such as Supermetrics) and with Google Sheets add-ons.


I recognize GA's features can be overwhelming: with all the information GA can track, it's easy to drown in a sea of data. Luckily, a simple Google search will return plenty of tutorials, guides and training videos that will allow you to go well beyond a basic knowledge of Analytics. They are a good place to start if you're new to Google Analytics or want to take it to the next level.

As a minimum requirement, make sure your installation of Analytics is correct and complete.

Analytics measurement plan

It’s relatively straightforward to get your Analytics set up properly and tracking data. But this is just a small part of the job. In fact, even before you get going with Google Analytics (or any other web analytics software), what you need is a measurement plan.

A measurement plan is a document that:
  • Defines your business objectives and outcomes you want to see; 
  • Presents the strategies and tactics to reach these outcomes;
  • Illustrates the metrics you need to monitor and the tools and processes to collect data; and
  • Includes goals and targets for your selected measures. 
In this sense, Google Analytics is only one of the different tools you have to measure your objectives and outcomes. If you're active on Facebook and Twitter or publish videos on YouTube, these should also find a place in your measurement plan.
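The four elements above can be captured in something as simple as a structured document or spreadsheet. As a hypothetical skeleton (the objective, metrics and targets here are invented for illustration), a plan entry might look like:

```python
# A hypothetical measurement plan entry for a development website,
# mapping an objective to strategies, metrics, tools and targets.
measurement_plan = {
    "objective": "Increase use of research outputs in the MENA region",
    "strategies": ["Publish regional briefs", "Promote via newsletter"],
    "metrics": [
        {"name": "PDF downloads", "tool": "Google Analytics", "target": 500},
        {"name": "Newsletter click-throughs", "tool": "Mailchimp", "target": 200},
    ],
}

# Each metric names the tool that collects it, so Analytics is explicitly
# one data source among several rather than the whole measurement system.
for m in measurement_plan["metrics"]:
    print(m["name"], "->", m["tool"], "(target:", m["target"], ")")
```

The format matters far less than the discipline: every metric you track should trace back to an objective, and every objective should have at least one metric and target.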

There are a lot of good resources out there on digital analytics and measurement planning. You can check Google's own guides or this post from analytics specialist Avinash Kaushik. Finally, this guide presents a great intro for non-techies on how to create a measurement plan for Google Analytics.

Yet, as essential as a measurement plan may seem, my recent work suggests that it is more the exception than the norm in development organizations. I'll discuss this further in my next blog post. In the meantime, how would you rate your current use of Analytics? Are you using it to its full potential? Do you have a digital measurement plan? Let me know in the comments below!

Thursday, July 21, 2016

Building a knowledge portal through Open Data

The Open Knowledge Hub (OKHub) is a collaborative initiative led by IDS to make good quality research accessible in an original way.

In essence, the OKHub is a "database of open-licensed metadata (bibliographic data and links) about research documents, organisations, and other materials." Around 20 knowledge partners, such as Eldis and 3ie, contribute their content to the platform, including the titles, URLs, abstracts and summaries, keywords, etc. of the research publications in their catalogues. To date, the OKHub contains over 20,767 documents. You can browse and search this wealth of information by different criteria (e.g. themes, languages, regions and countries) on the Content Explorer.

But collecting, aggregating and organizing this global content is only half of what the OKHub offers. In fact, the OKHub uses the same open infrastructure and technology to allow you to use its content to set up your own knowledge services. Services such as BRIDGE and the Gender Hub are integrating OKHub content to expand their online collections.

Earlier this year I supported the development of a prototype website that uses the OKHub dataset and functionality to present selected research on Challenges to Development in the Arab World.

Setting up the prototype 

The OKHub offers functionality for developers and site ‘builders’ to re-use its content. You can use a simple HTML widget to display selected resources from the OKHub catalogue. Alternatively, if your website is built on Wordpress or Drupal, you can use a plugin to seamlessly import selected contents from the Hub into your own site.

For the Challenges to Development prototype, we experimented with both solutions. Eventually, as the site is built on Wordpress, we downloaded and installed the OKHub plugin to import around 140 free, open-licensed content items relevant to the 10 key issues covered by the prototype. This content is aggregated on the resources page and presented separately on each thematic page.

Together with this open content imported from the OKHub, the proof of concept also provides two spaces for content creation and curation: a section to present featured publications and a blogging space to share relevant highlights from the MENA region.

This project was rather short and straightforward, but there are three key lessons I think are worth sharing.

A business case for the OKHub initiative and platform 

Actually, two. On the one hand, as a knowledge producer or intermediary, you can use the OKHub technology and infrastructure to contribute your organization's content, thus increasing its visibility, availability and accessibility. On the other hand, the WP plugin has huge potential, as it allows non-programmers to easily import content and augment their own knowledge service, or create a new one.

The human factor

Open content and automation alone are clearly not enough. If you want to maximise the chance of research uptake, the human factor is key. This means using a moderator with the required regional or thematic knowledge for quality control purposes and to tailor imported content to specific stakeholders. But it also means having resources to create your own original content, to curate and repackage existing content, to build and animate a community around your service, to ensure users are interested and engaged.

Tech for the (non-)techie

The HTML widget and WP plugin enable less technical people, with a basic knowledge of HTML and CMSs, to "plug and play" and build applications that meet their needs. However, you may still need some programming skills to fully integrate OKHub content on your own site. In my case, there was a conflict between the WP plugin and the site theme, resulting in individual records not displaying fully, or altering the site layout. Thanks to our colleague Tony Murray for stepping in and getting well beyond where my technical knowledge ends!

Overall, the prototype offers a good proof of concept for the idea that open knowledge and collaborative approaches can help extend outreach and uptake of research knowledge.

Do you know of other initiatives or knowledge services that use Open Data to share development research content? Let us know in the comments below!

Monday, July 18, 2016

Audit your Google Apps with GAT

For a couple of years, we’ve been supporting the deployment and adoption of a KM platform for the Collaborative Adaptation Research Initiative in Africa and Asia (CARIAA). As the programme has evolved and matured, so has the platform, with almost 500 user accounts, and anecdotal evidence of its usefulness to support knowledge sharing.

But besides counting the user accounts created, what is really happening on the platform? Can we learn more about the users? What are they contributing? Are any champions emerging? Which Apps are most used?

Go beyond Admin Reports 

Google provides its own reporting functionality through the admin panel. If you have a domain admin account, you can access Reports and track Apps usage, security, account activity, etc. The reporting features are rather rich and a perfect fit for ongoing monitoring of the platform. However, Google Reports only lets you look at data for the previous six months - which is probably not enough if you want a comprehensive picture of how the platform has evolved over time and what users have been doing with it.

For this purpose, the best solution we could find is the General Audit Tool (GAT).

Launched in 2010, GAT is primarily an auditing and IT monitoring tool. It allows you to audit or report on 250+ separate items for users, documents, email, calendars, sites, groups, etc. Additionally, it counts users' collaboration activities and calculates a 'collaboration index' across your domain, using multiple indicators such as file shares and file visits. Finally, you can set up alerts to get notified if domain policies are not followed - for example, when documents are shared outside the domain. The animated video below provides some more background information on GAT and what it is good for.
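GAT's exact formula for the collaboration index isn't something we can reproduce here, but the general idea of combining indicators into one score can be sketched as a toy example, with invented weights:

```python
def collaboration_index(shares, visits, w_share=2.0, w_visit=1.0):
    """Toy collaboration score: active sharing weighted above passive visits.

    The weights are illustrative assumptions, not GAT's actual formula.
    """
    return w_share * shares + w_visit * visits

# A user who shared 3 files and visited 4 scores higher than
# one who only visited 10 files without sharing anything.
score = collaboration_index(3, 4)
print(score)
```

Whatever the real weighting, the useful property is the same: a single comparable number per user that lets champions and passive users stand out across the domain.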

GAT comes at a cost, depending on the number of active Google Accounts on your domain. However, it also offers a full-featured trial. If you are using Google Apps, I recommend you test it out and find out what it can do for you.

How we used GAT 

GAT helped us extract a large amount of specific information on users, how frequently they interact with the Apps, and how they work with other users.

We ran several daily GAT scans over a period of two weeks and exported several datasets covering the Apps and metrics we had decided to include in our analysis. We then loaded this data into Tableau to aggregate it, segment it, analyze it and make sense of it through charts and tables.
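Before the data reaches Tableau, the daily exports need merging. As a sketch, assuming each scan exports CSV rows of user activity (the column names are assumptions), this keeps the latest login seen per user across all scans:

```python
import csv
import io

def latest_logins(scans):
    """Merge several CSV scan exports into one last-login-per-user table."""
    latest = {}
    for scan_csv in scans:
        for row in csv.DictReader(io.StringIO(scan_csv)):
            user, last = row["user"], row["last_login"]
            # ISO dates (YYYY-MM-DD) compare correctly as strings.
            if user not in latest or last > latest[user]:
                latest[user] = last
    return latest

scan_day1 = "user,last_login\nalice,2016-05-01\nbob,2016-04-20\n"
scan_day2 = "user,last_login\nalice,2016-06-15\ncarol,2016-06-01\n"
logins = latest_logins([scan_day1, scan_day2])
print(logins)
```

The same pattern works for any per-user metric in the exports: merge the daily snapshots first, then let the visualisation tool do the slicing.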

Here are some highlights from our analysis:

  • The growth of and demand for new accounts has been steady and well beyond the initial expectations. 
  • The majority of users are active: 75% of them logged onto the platform at least once in the past six months, and over half of them in the first quarter of this year.
    Chart: date of last login
  • The use of the platform has been increasing over time. However, this use is unevenly distributed, with some users clearly emerging as platform champions.
  • Google Drive is by far the largest app in terms of usage and the most frequently accessed by users, followed by Calendars and Hangouts. Drive currently hosts over 23K files and folders. The primary function of Drive appears to be storing and archiving documents; the creation of new content is secondary. About 50% of all files on the shared Drive were created elsewhere and then uploaded to Drive.
  • An increasing number of users are viewing and editing documents on Drive, confirming the adoption of the tool. However, collaboration appears to be concentrated in a small number of documents, while the great majority see only a few 'actions' (views or edits) performed by an equally small number of users.
    Docs overview: number of users, edits and visits per quarter 
  • In several instances, users are contributing to the CARIAA platform with their personal Google account instead of their CARIAA account. This has potential negative implications in terms of sharing settings, document management and overall platform M&E. 
This is very much a work in progress: we're learning as we go and constantly testing out new options. What's good about our progress so far is that we're generating the kind of longer-term trend data that really helps us support the client and adapt to evidence about usage patterns. And once it has been set up, it is not too time-consuming.

Of course, a lot of these features are available in the expensive all-in-one packages used by commercial organisations and the deeper-pocketed big NGOs. But it's hard work keeping up with trends and providing accurate, useful, timely data on a smaller budget, one more typical of the mass of development players. So help us - what tools and approaches have you found useful and can share? And are your clients or service users listening to you and the data, and changing how they work?

Wednesday, July 06, 2016

Making change happen - KM in a WSH team

It is cheaper and easier to change information flows than it is to change structure.
Donella Meadows

Identifying what causes change in organizations and attempting to identify the impact of specific projects is the kind of conundrum that keeps consultants and academics in profitable and engaging work. To borrow from Outcome Mapping language, it's a major step to be able to identify whether those people or organizations directly connected to a project, within its potential sphere of influence, change their behavior and work differently in ways that could at least be linked to the activities of the project.

Donella Meadows' seminal work, "Leverage Points - Places to Intervene in a System", in particular her description of the role of information and feedback loops, was one of the framing ideas for a review workshop of the KM project in the Building Demand for Sanitation (BDS) portfolio. Meadows explores systems, their complexity, and the enormous effort and time required to achieve lasting change. Her work also highlights the importance of power and paradigms, reinforcing the central importance of leadership, a point we've made consistently in this series of blogs.

The WSH team of the Bill & Melinda Gates Foundation organized the two-day workshop in September 2015. Its purpose was to review 18 months of KM initiatives by the BDS team, as well as Foundation-wide KM experiences, and to consider activities for the WSH team as a whole that would lead to stronger networks among the foundation and grantees, improve availability of and access to specific knowledge, and strengthen the organizational culture, improving the flow of knowledge.

The whole WSH team was involved in the workshop. The BDS KM team shared summary findings from a grantee survey, giving responses on which elements of the BDS KM program grantees valued and whether or how it had affected their work. The graphic below illustrates the relative value added of program activities, according to grantees.


And although 18 months is a short time in which to achieve the more fundamental changes in behavior that are the basis of sustainable change, there were clear indications that grantees believed the BDS KM activities were helping them integrate more effective KM into their work. For example, the pre-program survey in 2014 identified grantee KM priorities, and in the concluding survey grantees in general rated the project's impact positively.


Mainstreaming KM into the Rhythm of Business

Everyone in the WSH team had ideas and experiences to share, so much so that when it came to prioritizing proposals, a senior member of the team said he felt almost promiscuous because there was so much that turned him on. It's hard to summarize such a free-flowing, well-informed and thoughtful conversation, but the remarkable graphic facilitation of Nancy White at least conveys some of the richness.

The main theme that emerged was the necessity of integrating KM into the normal 'Rhythm of Business' (RoB). There was a consensus that KM has to be 'mainstreamed', not seen as something discrete, made up of specific periodic activities. The most fundamental recommendation was that the WSH Director would ask WSH team members to include KM activities in their individual goals, on the basis of common team goals to be developed by management and a menu of Key Performance Indicators (KPIs) to choose from. This would be supported by including KM in the job description for the then-about-to-be-appointed Deputy Director for Strategy, Planning and Management.

The team agreed also to determine how best to incorporate KM into the grant management cycle, and include it as a standard item on regular ‘Feedback to Action’ meetings. For example, two members of the team planned a pilot of a peer-assist format for part of an upcoming meeting, and they agreed to communicate lessons learned back at the next meeting. Finally, the team planned to institute regular meta-analysis of grant results, one or two times per year, which would feed into the planning process.

Active curation of information and widening access to resources behind paywalls was another theme. The team agreed to put resources towards a service or function that replicated the ‘Curated Updates’ experiment run throughout the BDS KM project. There was also a commitment to exploring how grantees could benefit from Foundation access to publications.

As ever, the longer-term impact of the workshop, and of the KM project more generally, will probably be influenced more by the 'normalization' of the concepts through the commitment of so much time to discussion, and by the personal engagement of staff in the issue, very much led from the top. The WSH team have committed to reviewing their progress on improving KM, so expect some more blogs in due course.

Meanwhile, what about long-term behavior change in organizations that has demonstrably improved knowledge flows, learning and information management: do you have any examples or ideas?