The Ergonomics Society is about to embark on a redesign of its website, and earlier this month I posted the initial user segmentation model, along with the draft user profiles and the prioritised scenarios. Now, following conversations with various folks including Tina Worthy and Richard Bye, we have an updated plan for user research.
In summary, what we plan to do is:
- Establish some baseline data for the existing site experience (so that we have something to compare with after the redesign). Richard Bye has kindly offered the use of his analytic tools in assessing this.
- Conduct in-depth interviews with participants from the first four priority segments, as follows:
- Information Consumers (×3)
- Society Members (×3)
- Society Customers (×2)
- 3rd Party Service Consumers (×2)
- Note that the breakdown here is designed to reflect both the relative priorities of the segments and what we feel is realistic given the resources available.
- Hold a focus group for the Staff Information Consumers.
- Run a formative IA exercise (such as an open card sort) to establish the key organisational principles for the site content. Participants to be segmented as per the depth interviews above.
Evidently, there will be a fair amount of prep involved in all of this, notably the preparation of recruitment screeners, interview protocols, scripts, etc. Note also that the analytic tools that Richard has offered will also need configuring; no doubt a key part of this will be determining precisely what metrics to measure as a baseline. I suspect we’ll need to adopt a pretty lightweight / agile approach, especially considering that most if not all of this will need to fit around existing work commitments. And we shouldn’t underestimate timelines either – it is one thing to manage delivery of a web project when everyone is directly accountable to you; quite another when everyone is lending their time on a voluntary basis.
Looking further ahead, we will also need to consider the choice of development platform. At the moment we are using phpMyAdmin, but it is likely that we will want to migrate to something more scalable and usable by a wider cross section of people (i.e. nominated content editors) in future. Lauren Morgan is currently evaluating alternatives such as Joomla and Drupal, and should be in a position to report back soon.
So, as a rough estimate, I’d say the timeline will pan out something like this:
- August: user research
- September: user research + data analysis. Output = refined segmentation model + profiles + scenarios
- October: interaction design + visual design (proceeding in parallel in so far as that’s practicable). Output = wireframes (which could be fairly simplistic, depending on the build approach) + visual design spec. (NB we should also consider producing a style guide for the site, but I am not sure we can deliver that as well within the scope of the existing project)
- Nov + Dec: build. Output = CMS templates + associated tools & resources, etc.
- Jan: UAT + soft launch
- Feb: full launch
Note that I’m assuming we will interleave user feedback at suitable iteration points throughout the above timeline – as UCD specialists we should know this better than anyone 🙂
Thanks Tony
Sadly my remote usability testing tool is locked behind our organisation’s firewall. But we have a contingency plan (more to follow). Assuming we can run a remote unmoderated baseline usability evaluation, I would suggest that we keep it pretty simple by using the ISO 9241 standard metrics of effectiveness, efficiency and satisfaction. It would be quite straightforward to capture task completion rates, time on task and subjective satisfaction across a prioritised list of scenarios. We should also consider running a (light) survey to ask visitors why they are using the site (and other questions) either as a screening questionnaire for the remote usability testing, or as a separate activity.
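To make the ISO 9241-11 framing concrete, here is a minimal sketch of how the baseline could be summarised once the remote study runs. The session records, task names and field names are all illustrative assumptions, not anything agreed for the project:

```python
from statistics import mean

# Hypothetical per-session results from a remote unmoderated study: each
# record holds task completion, time on task (seconds) and a 1-5
# satisfaction rating for one participant on one scenario.
sessions = [
    {"task": "renew membership", "completed": True,  "seconds": 142, "satisfaction": 4},
    {"task": "renew membership", "completed": False, "seconds": 300, "satisfaction": 2},
    {"task": "find event info",  "completed": True,  "seconds": 65,  "satisfaction": 5},
    {"task": "find event info",  "completed": True,  "seconds": 98,  "satisfaction": 4},
]

def baseline_metrics(records):
    """Summarise the three ISO 9241-11 dimensions per task."""
    tasks = {}
    for r in records:
        tasks.setdefault(r["task"], []).append(r)
    summary = {}
    for task, recs in tasks.items():
        completed = [r for r in recs if r["completed"]]
        summary[task] = {
            "effectiveness": len(completed) / len(recs),          # completion rate
            "efficiency": mean(r["seconds"] for r in completed),  # mean time on task, successes only
            "satisfaction": mean(r["satisfaction"] for r in recs),
        }
    return summary

print(baseline_metrics(sessions))
```

The point is simply that all three dimensions fall out of the same flat log of sessions, so whatever tool we end up using only needs to export per-task, per-participant rows.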
In addition to the remote testing and the contextual enquiry activities you listed, I think that we should get web analytics up and running as soon as possible. It should be easy to install Google Analytics (or similar) to give us a much better insight into current site usage. I’m assuming that we don’t already have this as it has not been mentioned.
I also think that we need to outline the site objectives so that we can be sure that we are directly addressing current problems and working towards the society’s strategic objectives. It’s still really early in the process but the sooner we can agree on a UX vision (or Parti if you’ve seen Luke Wroblewski’s stuff) the more likely we are to target our scarce resources in the right places. For example I’m not entirely sure of the site’s high level goals. Are we designing to support (a) improved brand presence, (b) targeted marketing campaigns, (c) better content provision or (d) effective tasked based applications?
If we try to focus on all of these areas there may be a risk that we’re trying to be all things to all people. Looking at the prioritised scenarios we could say that our core objectives are to increase:
– subscription renewals by x%
– membership applications by x%
– membership upgrades/progression by x%
– event registrations by x%
– advertising revenue by x%
– the quality and volume of content contributions by x%
– the ease of finding trusted ergonomics information
– the perceived status of the ergonomics brand…
– etc
But we can’t focus on all of these things for release one, can we?
Cheers
Richard
PS what’s the plan for writing up the user research interviews? I’ve never yet found the time to do in-depth qualitative content analysis on live projects, as it’s rare (unheard of) that I record and transcribe interview notes. I normally go straight from rough notes to concepts and themes, but we’re not likely to have the luxury of a room full of post-it notes to do this collaboratively. We could use the Google Docs method of live affinity diagramming over a phone conference, but in the first instance it may be worth populating an issues log so that we have a single spreadsheet to capture all the issues. That way we’ll have a single resource containing notes from all heuristic reviews (inc the notes on the LinkedIn discussion), usability testing comments, interviews and other sources. We can then turn this into a prioritised set of user requirements pretty easily. What does everyone think?
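One way the issues log could roll up into prioritised requirements is to rank each issue by severity and by how many independent sources reported it. This is just a sketch under those assumptions; the rows, column names and severity scale below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical issues-log rows: one line per observation, with the source
# (heuristic review, usability test, interview, etc.) and a 1-4 severity
# rating (4 = blocker). All issue names here are illustrative.
issues = [
    {"issue": "membership renewal form unclear", "source": "usability test",   "severity": 4},
    {"issue": "membership renewal form unclear", "source": "interview",        "severity": 3},
    {"issue": "search returns stale content",    "source": "heuristic review", "severity": 2},
    {"issue": "event dates hard to find",        "source": "usability test",   "severity": 3},
]

def prioritise(log):
    """Rank issues by (max severity, number of independent sources)."""
    grouped = defaultdict(list)
    for row in log:
        grouped[row["issue"]].append(row)
    ranked = sorted(
        grouped.items(),
        key=lambda kv: (max(r["severity"] for r in kv[1]),
                        len({r["source"] for r in kv[1]})),
        reverse=True,
    )
    return [issue for issue, _ in ranked]

print(prioritise(issues))
# Highest-severity, multiply-sourced issues come first.
```

An issue corroborated by several methods (here, the renewal form) floats to the top, which is roughly the triangulation we'd be doing with post-its anyway.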
And should we be using Basecamp (or similar) for project planning and collaboration?
Thanks Richard – very useful stuff. Taking each point in turn:
* Not sure how remote testing works in practice, but agree on the use of standard metrics. How much overhead is there in setting this up?
* Agree that a survey would be good – but I presume this would only apply to bona fide site visitors (i.e. not test participants)?
* Great Q re site analytics. Tina – can you enlighten us as to the current situation and options?
* Re UX vision, we did derive one during the 1st session at the stakeholder meeting (see https://isquared.wordpress.com/2009/07/06/audience-segmentation-model-for-ergonomics-society-website/ ) – Agree that this would benefit from further analysis, though.
* I think a separate session to review the themes from the interviews will be needed, but you’re right to ask what form the input will take. I’d anticipate we’ll all have to spend some time translating the interview notes into concepts & themes first.
* Like the idea of an issues log. Did you have a particular tool in mind? If it’s online & multi-user, we’ll ideally need something that has some version control & workflow associated. We’ll also need some way of reconciling its content with that of the prioritised scenarios (https://isquared.wordpress.com/2009/07/14/user-scenarios-for-ergonomics-society-website/), so that we have a clear overall picture of our requirements and design priorities.
But my biggest fear is that we’re taking on too much here – I know it’s all good UCD, but what we don’t have is the reporting structure and human resources to make all this happen. We have only goodwill (which we shouldn’t take for granted) and voluntary time. With that in mind, I worry that the scope may expand beyond what we can reasonably support. Perhaps the next step is to put names against the proposed activities, so that we can sanity-check how we will resource them?
Hi Tony
I share your biggest fear about scope, time and resources. I’m sure that the society exec have a desire to create a website that reinforces the brand and contributes to the achievement of strategic objectives. But I’m not sure about the level of central investment in terms of time, budget and support to drive this through. I’m not saying for a minute that the support isn’t there but if you (as project manager) are not sure of the resources at your disposal (people, carrots to buy services, sticks to get things done) then we’re heading for trouble.
We have a relatively amateurish site now by current standards; do we want a new amateur site, or something that we can all be proud of? I guess that I’m just testing the water a bit, because the user research phase is easy compared with all the work to follow (IA, IxD, visual design, usability, platform selection, development, content strategy, copywriting, content management, etc, etc)…
I think that you are right to say that we need a set of milestones and a to-do list so that we can allocate people to tasks in a cohesive fashion. I understand why you want to keep the discussion flowing on here (the blog), but it’s too slow and unfocused to get things done via what is currently functioning as a public email chain. We also need a proper collaboration platform like Basecamp.
So my proposals are:
- Find out what the senior Ergonomics Society stakeholders think about the project and how they intend to help (assuming that you know this, perhaps you can share)
- Set up a project collaboration site on Basecamp so that discussions focus on getting things done
  - Create a project plan
- Invite those who are actually motivated and able to help onto the Basecamp site and assign actions
  - Then we’ll know what achievable scope and timescales look like
- Regularly report progress, hold discussions, and post questions on your blog for wider engagement
Our other discussions and questions relate to the hows and whys of particular methods (remote usability, survey, etc). Happy to discuss these further but I’ll do this in a separate post.
Cheers
R
Hi Richard
Thanks for the feedback – again, I think you’re right on the money with these questions and suggestions.
Re your first bullet point, what in particular did you have in mind beyond what was expressed at the stakeholder workshop (and covered in my earlier posts)? At the risk of misreading you, I’d say the quick answer is that the support is there, but the understanding of what it takes to move from where we are now to a more professional operation is limited (but as I said, I may have missed your point here). What is clear is that up to now the engagement model has been one of ‘delegation by committee’, which might work fine for small-scale projects, but won’t scale for what we have in mind. I share your view that this project is too important to undertake without proper direction and project management.
Basecamp: I agree – let’s just do it. I’ve not used it before though, so if you have, would you mind setting it up? I think once that’s done we can start to address the other items on your list, and use that as a platform going forwards. Is there a cost implication? If so, we’ll need budget approval (Mark, is that your dept.?)
Cheers,
Tony
The Basecamp site is up and running. We need to firm up the project plan and start to assign tasks…
If anyone would like access to the Basecamp site please let us know – now is your chance to get involved (if only to see what’s in the plan)…
Cheers
Richard
Thanks Richard. Haven’t quite got into the Basecamp habit just yet but will give myself a notional kick up the wotsits tomorrow. Will also send an invite out to Alison at Which and anyone else who expressed an interest, e.g. the folks on LinkedIn.
Cheers,
Tony
[…] Alternatively, consider using one of a number of online tools for card sorting. Initially I was a little sceptical of these, as most are chargeable services, and with remote testing you’re going to miss out on so much qualitative feedback. But I have been reasonably impressed with Websort – it gets down to business very quickly and greatly simplifies the remote card sorting experience. We’ll be using this tool for our IA work on the Ergonomics Society website. […]