June 29, 2004
Agile is a software development movement that aims to cut the fat from the industry's trademark bulky timescales and bloated specifications. It's been devised by 'developers' or 'programmers' and as such has so far been pretty thin on procedural detail for user experience designers. In fact, some of the specific methodologies within the Agile family make absolutely no mention of 'experience design' at all. Most advocate lines of code being written from the very beginning of a project. Almost all state that you work through a list of requirements until you have the minimum amount of functionality to release a product. However, few go into any detail about how you get to the point of having requirements in the first place.
So, this is where a project team used to working from the very start of a project (determining what it is that needs building) can go wrong. Trying to lay lines of code before you have some idea about why or for what purpose is like taking a journey to a foreign city you've never been to before, without looking at a map first. You may hit the right continent if you have some vague sense of direction, however, all roads do not lead to Rome.
But there's more to Agile than laying lines of code as soon as possible. It's a mindset which can cast off the shackles of rigid 'design first' processes that pride themselves on quality but have done more than their fair share to create the software industry's bad reputation by delivering late, or not to specification, or not within budget (or any combination of these). The Agile Manifesto is a set of fundamental tenets to remind Agile practitioners to "Get On With It". And really, that's what it boils down to, regardless of which role you fulfill in the team.
In my recent experience of working on a project that was very much at 'seed' stage when we began, still trying to work out which of its many possible manifestations it could take on, I found that an Agile mindset was an effective mechanism for ensuring that ridiculously tight deadlines would be met. Being Agile means that the perfectionist that lies within most experience designers has to be persuaded that it's OK to get it right the second or even third time. Taking a 'sketch approach' to design, where the first versions are a best guess based on the knowledge available at that time, abates deliberation for fear of getting it wrong. Waterfall processes are fraught with fear of the bad specification. Once coded, a design is set in stone, not for revision. Technical folk usually get very cranky when you sheepishly sidle up to them with "You know that section here... the one that took you weeks to code, tweak and get just right... well... um... we er, need to um, change it... y'know, just a little bit..." [ducks for cover]. However, when the project team all agree up front that everything is subject to change, it loosens up that feeling of foreboding which has a side-effect of making people check thrice before committing to a specification. Naturally, changing things takes time, but when things take less time to design in the first place, you have more time to iterate until it is right.
A coding practice known as Refactoring can liberate us all from what Alan Cooper refers to as scar tissue, a phenomenon where code that has to be changed leaves a 'scar' that makes the foundations of the program unsound. It can be argued that working in an object-oriented fashion should enable components of a system to be rebuilt with no impact on the remaining body of code, and should therefore be the method of choice when following an Agile development process.
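To make the idea concrete, here's a minimal sketch in Python (all names and numbers are my own invention, not from any particular project) of the kind of refactoring that prevents scar tissue: a duplicated rule is extracted into one small function, so a later change is a one-line edit instead of a hunt through the codebase.

```python
# Before: the discount rule is duplicated, so changing it means
# hunting down every copy -- Cooper-style "scar tissue" in the making.
def invoice_total_before(prices):
    return sum(p - p * 0.10 if p > 100 else p for p in prices)

def receipt_line_before(price):
    return f"{(price - price * 0.10 if price > 100 else price):.2f}"

# After: the rule lives in one small function, so a rule change is a
# one-line edit with no impact on the code that calls it.
def discounted(price, threshold=100, rate=0.10):
    """Apply the bulk discount once the price crosses the threshold."""
    return price - price * rate if price > threshold else price

def invoice_total(prices):
    return sum(discounted(p) for p in prices)

def receipt_line(price):
    return f"{discounted(price):.2f}"
```

Both versions behave identically today; the difference only shows the day the discount rule changes.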
The Agile shift in thinking about experience design is not just about being iterative. It's fundamentally about designing in an object-oriented fashion too. Components need to be designed individually in order that they can be built individually in a staggered project timeline. Some components will be a few degrees more defined than others at any one time and this is hard to reconcile at first. It's uncomfortable thinking that you have to design a middle section before a start section, and so on. Some might say you can't possibly do it and expect to get it right. And there's the crux of the mindset right there. You don't expect to get it right the first time. You expect to get it right the second or third time around, when you have had time to design the start section.
Agile works, but the idea of laying lines of code from the outset of a project does not seem like a great idea to me, for the reasons stated earlier. I think this is where a production process can learn from User Centred Design's 'understand' or 'discovery' phase to flesh out business requirements, user requirements and the competitive marketplace. I would still advocate that the body of work done in this period is managed using an Agile methodology like SCRUM though, as it still works to 'gee the team along' even when they're not laying lines of code. The rhythm set by being iterative, combined with a disparaging attitude to excessive documentation and encouraging face to face dialogue, is a very effective framework to ensure maximum efficiency within the team.
It does take a significant 'opening of the mind' to allow ourselves to really work in this way, when everything we've known to date says we're going to get burned if we do. But it's inevitable that you get burned the first time you try new things. We must persevere, because unless the software and web industry embraces a total shift toward delivering on time, to spec and on budget, clients will continue to lose faith in its ability to get itself together. This will affect its monetary value in very real terms. If the tech sector's stock price collapses, as the survivors of the dotcom crash know only too well, it will really, really hurt.
January 16, 2004
Theory as far as I am concerned, then, is best understood as an emergent property of practice. Theories are in part post hoc rationalisations—the plausible stories which we tell ourselves to account retrospectively for our actions.
If only I could explain myself with such impeccable clarity! This paper entitled Theory for Practice by David Sless shows that when somebody can explain a concept or set of concepts without relying on jargon or buzzwords, it is extremely powerful. This is because it demonstrates full understanding: the explanation leaves no holes in comprehension caused by the syntax of an exclusive club.
Aside from the writing style of this article, the concepts within are also very pertinent to what I've been thinking/writing about with process and design theory. Not only does it suggest that rigid process isn't as valuable for solving problems as it is for teaching people how to solve problems, but it also articulates my feelings about moving from a linear way of working to an 'inspect and adapt' or Agile mindset... and that's just a little bit of it. If this is 'Constructionism', I must find out more about it.
Abstract from the paper: This paper discusses a constructionist approach to information design and contrasts it with the more widely used constructivist approach. The paper suggests that there are five principles of information design: politics, position, parsimony, politeness, and performance. Of these, politeness is the most important.
December 22, 2003
XPDAY - Mary Poppendieck, Tom Poppendieck - Lean Development - Tuesday 2nd December 2003
Waste is anything that doesn't create value for the customer. If the customer would be as happy without something, then don't do that something.
Value Stream Analysis - [Ant see this entry for a little more on that]
For fixing troubled projects, increase feedback at customer, team and management levels. A lack of feedback somewhere in the chain is the most common area for improvement in project debriefs.
Make change inexpensive. You can do this by:
- delaying commitment
- sharing partially complete design information
- developing a sense of how to absorb changes
- avoiding extra features
- developing quick response capability
- developing a sense of when to make decisions
Group what is likely to change together inside one (code) module so that interdependencies are intrinsically linked.
A module should have only one responsibility. Don't make them dual purpose.
Don't repeat yourself. Never copy and paste code.
Functionality doesn't have to be built until the order is received.
Deliver fast - rapidly, reliably, repeatedly
- Wait till there's a need before you start to build. This is a good incentive or way of 'pulling' your requirements out
- Don't 'push' with a schedule; you can't keep them up to date. Too much in a project's life changes and it's wasted time updating them.
- Make work self-directing through the use of a visual workplace whereby staff can see what state elements of the project are in
- Rely on local signalling (like a Daily SCRUM or other methods for communication between team members)
- Do small batches
- Steady rate of arrival (short iterations)
- Steady rate of service (test features immediately)
- Small work packages (integrate features individually)
- Reduce utilisation (you don't load servers to 90%... productivity in staff drops off exponentially after around 70-80%)
- Eliminate bottlenecks (everyone pitches in, whenever and wherever they are needed... yes, this means multiskilled staff)
Lean means introducing a gating mechanism so that full capacity is never reached. Lots of half done stuff IS BAD and is the result of overloading. Just as a highway gets choked up when there's too much traffic, the same happens in a workplace when it is overloaded with work.
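The highway analogy can be sketched with a toy queueing model (my own addition, not from the talk): in a simple M/M/1 queue, the average time a job spends in the system is 1/(service rate - arrival rate), which explodes as utilisation approaches 100%.

```python
# Toy M/M/1 queue model (an assumption for illustration -- the talk
# gave no formula): average time in system is 1 / (mu - lam), which
# blows up as utilisation rho = lam / mu approaches 1.
def time_in_system(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        raise ValueError("utilisation >= 100%: the queue grows without bound")
    return 1.0 / (service_rate - arrival_rate)

service_rate = 10.0  # jobs the team can finish per week
for utilisation in (0.5, 0.7, 0.9, 0.99):
    w = time_in_system(utilisation * service_rate, service_rate)
    print(f"{utilisation:.0%} loaded -> {w:.2f} weeks per job in the system")
```

At 50% load a job spends 0.2 weeks in the system; at 99% load it spends 10, which is why a gating mechanism that stops short of full capacity pays for itself.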
Build Integrity In
- Perceived – totality – overall customer delight
- Conceptual – the system's components work well together
The way to build both perceived and conceptual integrity is to ensure all aspects of a business are included in the design process. There are 4 questions to answer which cover all aspects.
- Why are we doing this? This is the business drivers question
- The Vision
- Success Model
- What needs to be done?
- User Experience Design
- Acceptance Tests
- User Tests
- Use Cases
- How do we build it?
- Programming methods (e.g. XP)
- Technological platforms
- How do we support it?
- Ask your support team to be involved in the design process
Empower the team
Kaizen Events are a way to bring a multidisciplinary team together to improve a process or product. The order in which they are done is as follows.
- Bring team together
- Set the challenge
- Brainstorm Solutions
- Present recommendations
- Decide at a "town meeting" (merely where all stakeholders are consulted)
- Implement immediately
December 10, 2003
XPDAY - Barry Fazackerly - Enterprise XP - Tuesday 2nd December 2003
Enterprise XP is the bastard child of DSDM (Dynamic Systems Development Method) and XP (Extreme Programming). DSDM is another agile process; however, like SCRUM it is more concerned with overall management than it is with programming practices. DSDM pays fairly rigorous attention and deference to the business case behind a software development project. In this regard it is more of a business analyst's or consultant's saviour than a programmer's. Enterprise XP endeavours to deliver the best of both worlds.
Always ask 'What is the Business case? What is the Return On Investment? What are the measures of success so far as the business is concerned?' A lot of these concepts are captured in The Balanced Scorecard management approach.
The running order of an Enterprise XP project would look something like this
- Feasibility Study
- Business Study
- Functional Model iteration
- Design and build iteration
XPDAY - Martin Fowler - Keynote - Tuesday 2nd December 2003
Martin Fowler started controversially by announcing that he was a bit sick and tired of the Agile world and process methodologies. He used book writing as an example and posed the question: When writing a book, do you do an outline first and then get steadily more detailed? OR Do you just muck in and get busy writing detailed sections and string them all together later? Both methods are 'successful' and merely the personal styles of two different authors. How do you objectively measure the success of the outcome of either of these methods? Isn't it completely subjective? To what do you compare or measure? [Ant - Here I would suggest that perhaps setting your own success criteria is enough, providing you do so before you set out to do the work].
Martin is more interested in knowing what makes good software design and how to achieve good software design. SWEBOK (SoftWare Engineers Body of Knowledge) is attempting to define what a software engineer should be knowledgeable about. What would a software university curriculum look like? Analysis, Construction, Design, Testing, etc.
The practice and science of Engineering separates design and construction. Agile, especially XP, does not. Agile advocates doing design and construction together so that each informs the other [Ant - this is only possible because the only cost of building code is man hours and not materials. If physical construction was 'Refactorable' would we engineer and build together too?] Design can be an evolutionary process. UNIX is an example of successful evolutionary design [Ant - although, is the success of UNIX due to the fact that it's free, therefore there is a pretty natural incentive to adopt and grow it, or is it the success of an evolutionary design that draws people to it?]. Testing, refactoring and continuous integration allow evolutionary design to happen. XP has these three factors that enable evolutionary design to converge.
Only design for the current set of requirements. Future proofing or anticipatory capability isn't so good because extra features and code complicates the system, thus crippling future efforts to adapt and extend the system later. Keeping it simple will enable evolution.
Evolutionary design can work because there are people on the team, willing and able to do it. Teaching design is about showing others how to see problems first. Then over time and apprenticeship, the art of making solutions to the problems can be learned.
You can tell when evolutionary design is happening because the team is motivated, code is being thrown away and the team can freely say 'this isn't working, let's change it'.
December 09, 2003
XPDAY - Richard Watt, David Leigh-Fellows - Acceptance Test Driven Development - Monday 1st December 2003
How does a coder know when they are done? The natural tendency for a developer is to continue to perfect a piece of code long after it has attained an acceptable level. The Unit Test ascertains whether a unit of code works. A Functional Test is used to ascertain whether a group of units are working together to satisfy a desired function. But these are both simply testing that the code is working in the manner that the developer intended. The Acceptance Test asserts the conditions of acceptance set by the customer or designer. They are less concerned with finding imperfections in code than with identifying the result of an imperfection. Acceptance tests relate better to requirements or stories than to units or objects at code level.
Acceptance tests are a way for the developer to know when they're done. Where automating acceptance tests is achievable, this should be done. A developer can keep working until the testing application gives 'the green light'.
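As a hedged illustration (the story, the `shipping_cost` function and its rules are all invented for this sketch), an automated acceptance test in Python might look like this: it asserts the customer's acceptance criteria rather than the code's internals, and a passing run is the 'green light' that tells the developer they are done.

```python
# Hypothetical unit under test -- stands in for real production code.
def shipping_cost(item_count, flat_rate=4.99):
    """Shipping is free on orders of more than 5 items."""
    return 0.0 if item_count > 5 else flat_rate

# Acceptance test for the (invented) story:
# "As a shopper, I get free shipping when I buy more than 5 items."
# Note it states the customer's conditions of acceptance, nothing more.
def test_free_shipping_over_five_items():
    assert shipping_cost(6) == 0.0    # just over the threshold: free
    assert shipping_cost(5) == 4.99   # at the threshold: still charged
    assert shipping_cost(1) == 4.99   # ordinary order: charged

test_free_shipping_over_five_items()
print("green light")  # the developer can stop polishing now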
Acceptance tests should be written at the same time as stories are specified. This helps to set the goal posts for and aid in estimating particular functionality for all disciplines involved in the process of designing and building software.
Acceptance tests are hard for many reasons. The customer is involved in defining the test. They are not always adept at structuring such a test framework and will thus need guidance on this. Acceptance tests raise the level of quality assurance support required on the project team. Writing the tests is time-consuming. Maintaining the tests is also time-consuming and requires careful management and organisation to ensure this is as efficient a process as possible. Setting a common framework for testing can be difficult across multiple work streams. So, acceptance test driven development is not an easy road; however, it is worthwhile as it ultimately saves time, eliminates waste and increases quality.
A typical chronology for an acceptance test driven framework would look like this:
- Select candidate stories for iteration or development
- Write the acceptance test skeleton for each
- Sense check the candidate stories
- Using the iteration planning workshop as a starting point:
- A small multidisciplinary group takes a few stories aside
- The group lists the subtasks associated with each story and then provides an estimate on how long this will take to do
- The group then presents back the subtasks and estimate to the wider group for critique
- Check story dependencies and estimates
- Pick highest value group of stories
- Agree stories for iteration
And there you have how Acceptance Tests should be integrated into an XP process.
December 08, 2003
XPDAY - Sean Hanly, Duncan Pierce - Introduction to XP - Monday 1st December 2003
Values of XP
- Courage = Get on with it. Be honest about your abilities. If it's not working, change it, refactor it, throw it.
- Simplicity = Keep all things as simple as they can be. Investment in XP products is incremental.
- Feedback = Monitor performance, inspect and adapt. Measure early returns.
- Communication = Focus on info that matters, don't do unnecessary documentation.
Agile is a mindset
Agile is about making incremental investments – small changes to serve longer term goal. Nothing Agile is done in massive increments. So, changing to an agile methodology should be the same. Small incremental steps toward a better solution where each step has been evaluated for success. Change one thing at a time, do things simply, ask 'what already works?'. There is already good in what you do, don't throw it all away. Make a series of small process changes. Using XP is based on these principles - get feedback, measure benefits, inspect, adapt.
When prioritising features, look at which will make for early returns and assess which of those deliver most value to the customer.
When designing and building, see good enough, not perfect (this pertains to the type of feature, rather than the quality of workmanship).
Organic processes work, not just the mechanical and systematic. Order within teams can rise from chaos if the team is empowered to organise themselves.
Don't be afraid to measure progress, make it visible and have courage about honesty around it. Progress indicators are only as valid as the last measurement. Assess the progress over one 'time box' or 'iteration' and then estimate the next based on progress or 'Velocity' of the last.
In XP there are two types of planning sessions: one for each iteration, and one for each release. When using XP you assume that you are not going to release full functionality in the first release, but build it up over a series of releases. The Release planning meeting should have the 'Customer' (this is a term used within XP to denote the sponsor of the project), developers, tester(s) and an interaction designer all present. Each requirement is captured on an index card as an abstract description of something that the system will provide. Each of these is estimated in terms of required effort and value. One effort unit is usually equated to a perfect day's effort with no distractions. This unit is then given an arbitrary descriptor (e.g. Gummy Bear) to abstract future indication of velocity away from any expectation of time scale. The value indicator equates to the relative importance of each requirement over the others. Iteration planning involves writing acceptance tests (even if they are not perfect) for each story and ordering the stories according to value and effort. In this case, the team must balance what can be achieved in the iteration with which story has the most value. Where possible, acceptance tests should be automated, with development and refactoring carried out only until the unit of code passes its test.
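A small Python sketch of the estimating game described above (all the stories, numbers and the velocity figure are invented): stories are sized in abstract units, last iteration's velocity caps what the team signs up for next, and stories are taken in order of value per unit of effort.

```python
# (title, effort in "Gummy Bears", customer value 1-5) -- all invented.
stories = [
    ("search by name", 3, 5),
    ("export to CSV", 2, 3),
    ("admin audit log", 5, 2),
    ("saved filters", 2, 4),
]
velocity = 6  # Gummy Bears completed last iteration

# Greedy plan: highest value-per-effort first, fill up to last velocity.
plan, remaining = [], velocity
for title, effort, value in sorted(stories, key=lambda s: s[2] / s[1],
                                   reverse=True):
    if effort <= remaining:
        plan.append(title)
        remaining -= effort

print(plan)  # → ['saved filters', 'search by name']
```

The greedy ordering here is one plausible balancing rule, not the XP prescription; the point is that the abstract unit and last iteration's velocity, not the calendar, bound the commitment.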
A good story looks like this
- valuable to the customer
Tests that enable the project team to know when they are done are crucial and useful aids in setting targets. There are three levels of test. Screen level tests cannot be automated and involve user testing for comprehension and visual checking by the design team to ensure they are as specified. Interaction tests can be automated and test whether certain use cases work, technically. Engine tests are base level code and can be automated. A tip for writing all tests: make the language and labelling as consistent as possible, from the customer level all the way down to the object (code) level. This will reduce the need for documentation and aid communication within the team.
XPDAY - Mary Poppendieck - Keynote Monday 1st of December, 2003
Mary's talk mainly focussed on the wider issue of productivity and engendering it within a workforce. She opened with a statement that productivity had a direct relationship to standard of living due to increased profits. Increased profits are derived from increased sales and history has many examples where this is evident.
The first step to increasing overall profits within an organisation is to focus on the core business practices and work toward being more productive than the competition in these areas. Then work on non-core business practices and match the competition's performance on these.
Being productive does not mean sacrificing quality. You could improve speed and decrease overall quality or value, and this would not be increasing productivity. Productivity is about putting the same effort into something and getting more out of it by changing the method in which the work is done. Or being able to charge the same amount for doing less work.
We can be productive by either reducing direct cost (i.e. what a client would pay for) or reducing indirect cost (i.e. streamlining processes and methods). The primary way to reduce direct cost in software development is to build only the functionality that is required. Usually 80% of software product functionality is infrequently used or not used at all. Each piece of functionality should have its return on investment measured, and then only those yielding the highest value should be built. Do the minimum marketable features, then release. "Release early, release often" moves profit forward in time, thus paying for future releases.
Value Stream Mapping is a good tool for analysing the way in which a business spends its time. It is derived from Japanese manufacturing process analysis. It is basically done by stepping backwards through a process and documenting time taken to do a task and time wasted in between tasks.
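A back-of-envelope Python version of such a map (all the steps and numbers are invented for illustration): stepping through the process, log time spent actually working on a task versus time the work sat waiting in between, then compute what fraction of the total lead time adds value.

```python
# (step, value-adding days of work, days the work waited beforehand)
# -- an invented software delivery value stream.
steps = [
    ("code & unit test", 3.0, 0.5),
    ("code review", 0.5, 2.0),
    ("deploy to staging", 0.5, 1.0),
    ("customer acceptance", 1.0, 4.0),
]

work = sum(s[1] for s in steps)  # time creating value
wait = sum(s[2] for s in steps)  # time wasted between tasks
efficiency = work / (work + wait)

print(f"{work} days of work, {wait} days of waiting -> "
      f"{efficiency:.0%} of lead time adds value")
```

Even in this modest example, most of the lead time is queueing rather than work, which is exactly the kind of waste the mapping exercise is designed to expose.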
Overall, success should be measured not per employee, but by the increased productivity of the organisation as a whole. There is a Japanese term, 'Keiretsu', which encapsulates the notion of a cooperative group of related companies supporting one another. Productivity for a software product can be measured by the increased revenue in the supported business per dollar spent by the IT organisation charged with maintaining it. To help design for this, it is very helpful to have those who will be supporting the product, from the technological level through customer service, involved in the design effort.
November 28, 2003
SCRUM & UCD - Deborah Hartmann
This Diagram is one Deborah sent me illustrating how she's approached the effort to integrate User Centred Design with SCRUM.
We're going to continue the conversation in the comments section underneath this entry. Feel free to join the discussion.
October 22, 2003
ForUSE – The Agile Customer's Toolkit – Tom Poppendieck
Most books and publications about XP and agile are very programmer oriented. There's no place for requirements analysis, UI, or interaction design.
Writing effective use cases –
Tools for Customer side practices:
Decision Tools (how to decide what to do)
Role tools (how to organise work and team)
Story telling and customer tools
Effective collaboration is based on shared:
- Values – implicit belief system, vision, or mental model about desired business reality or purpose.
- Principles – guiding ideas, insights and rules for deciding.
- Practices – what to do; actionable.
Manufacturing metaphors for software development don't work because, if you talk to manufacturers and engineers, every time something is built the process is different and unique. Toyota will stop at every stage of the manufacturing process and test whether it actually works.
Lean Principles... Tool 1.
Eliminate Waste – Basically, this just means be effective. Weed out the parts of the process which are not.
Amplify Learning – This is what iteration is all about. Learning is more important than the code itself.
Decide as late as possible – By waiting, you get more information and feedback
Deliver as fast as possible – if you are going to wait, you have to deliver fast.
Empower the team – agile processes focus on people. Complexity means working together... as effectively as possible. Let the team figure out how to do their job, because they live and breathe it and will obviously work out the easiest and most efficient way for them. It will also make them happier.
Build integrity –
See the WHOLE – It is more valuable to measure the effectiveness of the whole than each individual's effectiveness. The team means that everyone is accountable. Rewards should be given to the team, not to individuals within a team.
Concurrent Development... Tool 2
Why are we doing this? What needs to be done? How do we build it? All happen concurrently. Information is handled in such a way that you hand over requirements at increasing levels of detail as the project proceeds.
What do we do first? Breadth or Depth? Both. Breadth: Low detail system intent and release and iteration planning. Depth: most important features first. Working app every iteration.
Doing the most valuable first is more important than doing the most risky... [what is value?]. Build by feature - order by ROI. Defer commitment. Simplicity, feedback, let subsystems and frameworks emerge. Value learning over code. Real customer needs, re-planning and re-factoring are not rework.
Chartering... Tool 4
I don't know why he skipped over tool 3... something about a chair with legs.
Chartering. A team is a community and needs a purpose to exist. Individuals contribute differently. Customers > business goals. Testers > Quality Goals. Interface designer > usage goals. Developers > architecture goals. Analysis > domain goals. A common understanding of purpose is required so all understand the mission. Define success, frame boundaries, facilitate information flow. Align decisions. Mission statement or elevator pitch. Objectives outlining purpose of the project and what it's about. Committed resources. Who defines the success? What is important? What is success? scope? Schedule? defects? resources? You should be able to express all aspects of your mission on one page. If you can't say it concisely, you don't understand it.
XP Practices... Tool 5
Customer Practices: Release and iteration planning. Frequent small releases. Customer tests. Story telling. UI Model. Essential Use Case model...
Developer Practices are the Engine.
Agile Development cycles. Release planning is all about breadth. Iteration planning is a mixture of depth and breadth. Implementation planning is all about depth.
User Stories... Tool 6
Often confused with Use Cases. Stories cover everything the user cares about, both functional and non-functional. Story Card content: title; one or a few sentences describing what is wanted, written by the customer; a developer estimate of relative cost; sample tests sketched on the back. It's a hand-written index card, which has a tactile advantage. Low inhibitions to throw away. Used for sorting, allocating and tracking. The card is not everything though. You need card, conversation and confirmation. The value is in talking about it, not in documenting it... [hmm, I can see a few probs here from my experience]. Story sizes are determined by implementation effort. They should amount to two to five pair days.
Stories enable FLOW... Velocity can be gauged by assessing the history of iterations and the burn rate. This establishes reliability.
Tom advocates that the end users, UI designers, UC designers, subject matter experts, testers and analysts, and process and product owners define the right stories and the right tests to then feed requirements to the developers... [This is a really quite similar approach to staggering work to the one I was writing about a week or so ago]
Domain Language... Tool 7
Find the right words. A glossary of domain language will save a lot of pain and arguments within the team. Effective communication depends on a shared language. EVERYONE must have a common understanding of what is what. A domain should be what the software is about. Domain concepts are what the system shows, knows and remembers. Stories, conversations, customer tests, and code should use domain language. The customer already knows it, though they may not always use it precisely. The language must be rich enough to capture business concepts, rules and relationships. Domain concepts will usually make an appearance on the interface too. It should be directly implemented in code... [lovely gem! It would be worth doing research into the language of the audience too, I would say. This would make this even more powerful]. You should iterate this language too. It should be defined in the same way as you would plan doing XP... work on breadth and depth at the same time. Language extends to informal UML, digital photos, class diagrams, interaction diagrams, state diagrams. Make them quick and disposable. [oops... we've been too detailed in this area and spent too much time doing beautiful flow diagrams, thus making them set in stone on some level. They didn't facilitate throwing away and starting again. Makes me think we should come up with a velcro-backed kit or something for making dynamic, easily reworkable flow diagrams... that would be cool!]
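A tiny hypothetical Python sketch of what 'directly implemented in code' can mean (the domain, class and method names are all invented): the names come straight from the customer's glossary, so a story, a conversation and the implementation share one vocabulary.

```python
# 'Policy', 'lapse' and 'renew' are terms from an invented insurance
# domain glossary -- the code speaks the customer's language directly.
class Policy:
    """An insurance policy as the customer describes it."""

    def __init__(self, holder, premium_paid=True):
        self.holder = holder
        self.premium_paid = premium_paid

    def has_lapsed(self):
        """A policy lapses when the premium goes unpaid."""
        return not self.premium_paid

    def renew(self):
        """Renewing a lapsed policy means paying the premium again."""
        self.premium_paid = True

policy = Policy("A. Example", premium_paid=False)
assert policy.has_lapsed()
policy.renew()
assert not policy.has_lapsed()
```

A customer story like "a lapsed policy can be renewed" reads almost verbatim out of this code, which is exactly the payoff being described.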
Essential Use Cases... Tool 9
yup, he skipped a whole bunch more. Will ask about that later.
What people use the system to do. Business process workflow. A use case is about a business goal: steps to reach the goal, user intent steps, system responsibility steps. Use cases implement some part of a workflow. Effective use cases are lean; most write too much. Use conversation and tests to define details. A use case is a user's steps to achieve a goal. A story is a unit of developer work. Don't mix them up.
Alistair Cockburn - Writing Effective Use Cases - buy it. Essential use cases are really brief! Like REALLY brief, enough to fit on a palm card.
Value to use cases should come from frequency of use. However, make sure that they're not subgoals... as in something that must be done to achieve something else. Prioritise according to value... Now he's going very fast through really good stuff and my brain is tired after a long day. I have slides, so will have to try and fill in the gaps later.
Interface Model... Tool 10
Organise tasks, paper prototype, refine prototype, define tests... eaarrrragghhh!! oh dear, brain has ceased to function altogether now. :-(
October 21, 2003
ForUSE - Panel Discussion – Between Extreme and Unified
Between Extreme and Unified: Where are the Users and Usability in Development Processes? - Panel. Ivar Jacobson, Jim Heumann, Ron Jeffries, Jeff Patton, Larry Constantine.
Jeff Says... Interaction designers make great Extreme Programming customers. Don't design up front. Quote "You keep using that word [design]... I do not think that means what you think it means..." You can't design without having an integrated, multidisciplinary design approach. We're talking about very thick design from surface down to data level. One person's design is another's requirements. Do as much design as is necessary to proceed to the next step in development...
Ron says... "All software development methodologies are based on fear" - Kent Beck. Fear from customers, fear from management. Programmers using XP are not predisposed to any particular order of doing things. They will, however, need to reshuffle things if new requirements are brought to an iteration: one in for one out. You will get things on time.
Jim says... Usability fits into the Rational Unified Process. Users are at the centre of RUP (it's based on use cases). Actors in a business use case model define the value of use cases to certain actors within the business. Usability comes in at the business level and the interface level. Use cases need to be at the right level so as not to constrain the creative team, and not to let them go too wild. Write the right use cases. Write the use cases right. Write the right system. Write the system right.... nnk...
Ivar says... We essentially agree. To be successful in the software industry, we must raise the level of competence in our teams. Tools are going to get better and better as they develop. As an industry we need knowledge captured as best practices - not only best practices for designers, but also for managers. We need to develop a process for the complete product life cycle.
Question to Jim and Ivar: give one example of a project that followed RUP correctly but failed? ... no answer.
The same question to Jeff and Ron about XP: "Everything seemed to keep going fine; customers were literate, the process was working well, and the quality and usability were high. But the client hadn't listened to their users adequately, so the product ultimately failed."
What people are best suited to XP? People who will take on any task and focus on getting a good result for the team. The team works best when the team works together. We want people who are good at what they do and don't hide behind a process.
What people are best suited to RUP? There is no specific kind of person necessary for RUP. RUP is about establishing a common language so that communication can be facilitated. You want people who are good at what they do. You also want people with diverse skills, so that they can empathise with other members of the team.
What kind of project is XP best at? The best kind is a project with a finite amount of time and mainly low risk as far as human life is concerned.
What kind are ill suited to XP? Organisations that don't hold values sympathetic to agile processes in general cannot make Agile processes work.
What kind of project is RUP best at? RUP wasn't originally designed as a management process. It was designed to help people know what a good way to do 'x' actually looks like in practice. RUP was made through analysing what was common across lots of projects. RUP is a framework of knowledge, not a specific process. You can apply it to anything from web development to military applications. It can be big or small. It's been going for 25 years. It is designed to be specialised or customised, not one-size-fits-all.
To the best of your knowledge, what percentage of RUP adopters actually do it right, instead of just buying the software and going to classes? About 20%.
And XP? The number is increasing, but we would guess a smallish percentage.
What, if anything, does XP offer to help the overall visual architecture or organisation of the user interface? Nothing. XP is very much about making good code; it's up to the customer to specify the UI.
And RUP? Through creating a user experience model derived from the use cases: representing flows and flow maps to show how screens fit together. There are specialists on a RUP team dedicated to making a good UI.
Can UML be used as a tool to communicate with XP teams? But of course! One of UML's strengths is aiding collaboration... The XP people say that not many people actually know how to use UML properly. It's a very specific language. It is a good tool to know, but you shouldn't rely on it as a communication protocol.
In XP, is the role of Customer responsibility a confusion of expertise? Can a customer specify a good user interface? "Well, it's up to the customer." Jeff says you can't expect a customer to design something if they're not qualified to do so. It comes back to common sense... use your head. Can your customer specify a user interface? If they can't, then perhaps you need to accommodate them with someone who can realise what they want, in the form of an interaction designer.
October 20, 2003
ForUSE - Jeff Patton
Usage Centred design in Extreme programming and agile development environments
Agile software development isn't anything new. Books since 1971 have discussed the psychology of teams and the programmers within them.
- Dynamic Systems Development Methodology
- Crystal Methodologies
- Feature Driven Development
- Adaptive Software
- Extreme Programming
Born of a financial need to make things quicker... a meeting of 17 people at Snowbird, Utah, in 2001 formed 'the Agile Alliance'. The 4 core principles of the Agile Alliance:
- Individuals and Interactions over Processes and Tools (within the business)
- Working Software over Comprehensive Documentation
- Customer Collaboration over Contract Negotiation
- Responding to change over following a plan.
There are other additional statements that are also important; they can be found at the Agile Alliance website.
Agile, like UCD, is an approach to a method, not a method itself. Releases are composed of increments, which deal with making features. Release cycle: plan release, feature list, evaluate release. Increment cycle: plan increment, determine feature list, evaluate increment. Feature cycle: design feature, develop feature, evaluate feature.
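The nesting of those three cycles can be sketched structurally. This toy code is my own framing (not from the talk); it just returns the steps each cycle would walk through:

```python
def feature_cycle(feature):
    """Innermost loop: design, develop, evaluate one feature."""
    return [f"{stage} feature: {feature}"
            for stage in ("design", "develop", "evaluate")]


def increment_cycle(features):
    """An increment wraps a batch of feature cycles."""
    steps = ["plan increment"]
    for feature in features:  # the determined feature list
        steps += feature_cycle(feature)
    return steps + ["evaluate increment"]


def release_cycle(increments):
    """A release wraps a series of increments."""
    steps = ["plan release"]
    for features in increments:
        steps += increment_cycle(features)
    return steps + ["evaluate release"]


print(release_cycle([["login"], ["search", "billing"]]))
```

Seeing it as three nested loops makes the feedback points obvious: evaluation happens at the end of every level, not just at the end of the release.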
Less emphasis on artifacts and up-front design; more emphasis on customer and end-user collaboration, and on day-to-day collaboration within the development team. Incremental improvement resulting in WORKING and USABLE software. Feedback using iterations.
Interesting XP points: simplicity in design, test-driven development, collective ownership, coding standards, system design metaphors, frequent small releases, customer acceptance testing.
Injecting UCD into XP. Release: reconcile roles and goals with tasks, then features. Role and task determine feature priority. Role and task information drives feature design. Use feature priority and cost to find scope-cutting opportunities. Increment: role and task information determines bug criticality. Feature: role and task information drives feature design. Test using use cases, assuming a user role.
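As a toy illustration of "role and task determine feature priority": the scoring formula, weights and feature names below are entirely my own assumptions, not Patton's.

```python
def feature_priority(role_importance, task_frequency, cost):
    """Higher score = do sooner: cheap, frequently used features
    for important roles float to the top of the backlog."""
    return (role_importance * task_frequency) / max(cost, 1)


# Invented example features, scored on invented 1-5 scales.
features = {
    "record vitals": feature_priority(role_importance=5, task_frequency=5, cost=2),
    "export report": feature_priority(role_importance=2, task_frequency=1, cost=4),
}
backlog = sorted(features, key=features.get, reverse=True)
print(backlog)  # → ['record vitals', 'export report']
```

The same scores double as scope-cutting input: anything scraping along the bottom of the list is a candidate to drop when time runs short.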
Understanding the domain. Contextual design work: taking photos of your end users, hearing from them, their managers and other stakeholders. The whole team needs to be across the end users.
Tactile collaboration tools and techniques - all your typical post-it note jockey stuff. Food and a kitchen timer are helpful. Means, motive and opportunity (as a way to get people involved in collaborative working - basically, make it easy for them to do so). Make it fun and quick-paced.
Accuracy and detail aren't the same thing. The focus should be on accuracy to begin with; detail can come later... [this really suits my 'cut by cut' thoughts]. A conversation is better than a document. A poster is better than a document (they're like radiators of information - nice metaphor!). Avoid literal UI renderings.
Must get Alistair Cockburn's (Humans and Technology, Inc.) book.
Use focal roles and focal task cases to drive priority. Relax standards on the unimportant features. Pay special attention to quality for focal roles and tasks.
Detailed design comes in where necessary... as in, later on. Write essential use cases, build abstract UI prototypes, render wireframe UIs, validate through testing.
Agile is great because it allows for the mistakes YOU WILL make. The penalties are less because of the iterative nature of the process.
There are problems, but they mainly come from team issues: people not wanting to change, or being freaked out at being out of their comfort zone.
Collaboration plan - contract between customer and design/development team.
ForUSE - Designing for Breakthroughs in User Performance – Gennine Strope
More raw notes...
Case study in Usage Centred Design meets Agile in making management software for nurses.
Brainstorm User Roles
Some mistakes were made where profiling wasn't thorough enough and goal-directed design wasn't thought through well enough. These guys still had something like 20 'severity 1' usability flaws after 20 sets of user tests. No design lead time - straight into engineering. This is supposed to be selling us on XP and usage-centred design, and so far I'm feeling like it's telling us that if you measure success by user comprehension and understanding, this just didn't stack up well. However, she's touting this as having shaved 4-6 hours off training nurses compared with the old system. But cripes! What must their old system have been like?
Lots of people say small design teams are best... so does she. She basically advocates contextual design. Keep statistics! (Good, something new... I was scared there'd be nothing to think about from this.) They're proof of progress within the business. Return on investment must be measured to justify it to management.
She's got a 'pocketful of wisdom!' Omigod, that's slightly nauseating... as are the PowerPoint sound effects.
Portsmouth, ForUSE, Jeff Patton
Arrived in Boston last night at about 9.30 and made our way somewhat tortuously from Boston Airport to the hotel in Portsmouth, via running for a coach and then catching a cab from the 'Stoppango' to the hotel where the conference is to be held. Why are hotels all so much the same? This one feels just slightly tired and in need of a facelift, but basically very comfortable... mustn't grumble and all that.
Woke up this morning at 'sparrow's fart', which was about 5 am... then again at six because the assholes who had the room before me left the alarm set for that time, then again all morning till I got up. Spent the day exploring Portsmouth (see gallery below).
The first of the reception gatherings was held in the evening, where I met a few people. One of them was Helmut Windl, who works for Siemens and is an associate of Constantine & Lockwood, who put this conference on. He's been working toward integrating performance-centred design and usage-centred design. Some of what he's worked toward is making workflow obvious to the end user through the interface (a basic example would be representing the steps in a linear process). He's extended this to form a hybrid between an object-oriented use case and a user task flow, to aid in the development process. He also talked about prioritising features against the criteria of frequency of use, overall importance (to tasks), and business importance. I was still a bit jet-lagged, so I couldn't really get the most out of what we were discussing. He had more to say; it didn't stick in my head so well.
Helmut did, however, pull Jeff Patton over to talk to Lorna Ledden, Andy Scotland and me about what we're here to glean: how to integrate user/usage-centred design with agile processes such as XP and Scrum. Jeff has had a lot of experience with this very thing. The first thing he talked about was measuring risk on a project to determine how much design needs to be done before coding should begin. Risk assessment is done by weighing potential revenue loss against the resources that will go into building a given product (i.e. potential money lost on the final product vs money spent to build it). Design lead time then becomes a function of risk.
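Jeff's rule of thumb, as I understood it, can be sketched as a simple ratio. The thresholds and figures below are invented for illustration only, not anything he quoted:

```python
def design_lead_time(potential_revenue_loss, build_cost):
    """Compare what a failed product would cost against what it
    costs to build; the higher the ratio, the more up-front
    design lead time is justified. Cut-offs are arbitrary."""
    risk = potential_revenue_loss / build_cost
    if risk > 10:
        return "substantial up-front design"
    if risk > 1:
        return "moderate design lead time"
    return "minimal design; start coding early"


# A product whose failure costs 20x its build budget warrants
# serious design time before anyone writes code.
print(design_lead_time(1_000_000, 50_000))  # → substantial up-front design
```

The useful part isn't the numbers; it's that the ratio gives a team a shared, arguable basis for deciding how much design happens before coding.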
We also talked about our past experiences with different processes, from UCD to XP to SCRUM, which was useful in that he had some insights for us about involving all team members, but at the right level and time. I think this confirmed what we knew from experience, but we still don't have a handle on the fix yet. How do we get that balance right? Jeff suggested he knew of some techniques to get it right, but wasn't able to demonstrate them at the time. Might have some supporting info later.
October 16, 2003
Defining priorities for the product requirements backlog
I wouldn't advocate using this in isolation, as I think you need to filter your requirements through a user benefit vs feasibility process first. But this is very useful for defining which of those get worked on first.
October 13, 2003
On Friday I attended a SCRUM course given by Ken Schwaber, whose brainchild is the SCRUM process (agile software development). A two-day course crammed into one was a little hard to absorb, but nonetheless a great opportunity to pick the man's brains about his process.
OK, here are the basic tenets that I had mostly already grasped.
- SCRUM is an iterative process (inspect and adapt), good at isolating teams from 'seagull management'. Each iteration is about 30 days, and no one is allowed to come and feed in more requirements, or changes to existing ones, within that window (known as a sprint). This is regulated by the 'Scrum Master'.
- The team meets every day at a given time and place to discuss three things. Each person reports i) What was achieved yesterday ii) what will be worked on today iii) what obstacles stand in the way of getting this done.
- Requirements are fed to the team on a basis of 'best defined, riskiest and most business-critical first', meaning those requirements which are known and whose solutions are clearly understood are prioritised before those less well defined. Those features deemed to have the most risk and/or business value (a balance of business benefit and user benefit) are also prioritised higher than those which do not.
- Focus on Return on Investment.
- Minimal reporting
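The 'best defined, riskiest and business critical first' ordering above can be sketched as a simple sort. The 1-5 scores and the requirements themselves are invented for illustration:

```python
# Invented backlog items, each scored 1-5 on how well defined the
# requirement is, how risky it is, and how much business value it carries.
requirements = [
    {"name": "user login",     "defined": 5, "risk": 4, "business_value": 5},
    {"name": "theme switcher", "defined": 2, "risk": 1, "business_value": 1},
    {"name": "billing engine", "defined": 4, "risk": 5, "business_value": 5},
]

# Sort descending: best defined first, then the riskiest / most
# business-critical among equally well-defined items.
backlog = sorted(requirements,
                 key=lambda r: (r["defined"], r["risk"] + r["business_value"]),
                 reverse=True)
print([r["name"] for r in backlog])
# → ['user login', 'billing engine', 'theme switcher']
```

A real Scrum team would argue over the weightings rather than trust a formula, but even a crude score makes the sprint-planning conversation concrete.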
Things that I had very much clarified were:
- SCRUM basically has a premise that you can't just hand over a brief or spec, as this flies in the face of the iterative cycle, where a multidisciplinary team cracks on with finding solutions to individual requirements rather than defining whole systems (yes, I was worried here... but I'll explain further; it's not that bad). Basically, Ken used the example of a flock of geese, which use an empirical process to find what they need when migrating. Their requirement is somewhere that's warm and has enough food and water for the winter. It doesn't matter where they go; so long as these fundamental requirements are met, the geese are happy.
- The other revelation was that SCRUM can be used on anything from planning a party to building software. The premise is a tight-knit multidisciplinary team, meaning we have design, production, tech and editorial all working together at the same time, solving roughly the same set of requirements in a sprint as each other... and talking lots. So this allayed my fears that SCRUM was all about clients talking to technicians, with design left out of the equation.
So, here's the tricky part. Alan Cooper staunchly argues that you can't design a good system as a set of answers to requirements. You need to be able to specify an entire system and its dependencies if you are to succeed in making a well-designed system. Now, I mostly agree with this, but am willing to experiment with the rigidity of this assertion, purely because it just is not economical having your techs sitting around answering the odd question while you spend months designing a system that you're reasonably sure, but not definitely sure, will work. Nor, I would argue, is it a good way to build happy, and thus productive, teams.
There will always be a need to do some preliminary research up front - user research and defining requirements - where you can't involve all team members 100%. Just as, being a user experience designer, I can't write the code when the time comes for that. So ultimately, when we divide a project into Understand, Concept and Design phases, it defines time periods where different skillsets are 'at the fore'.
So, whilst in the Understand phase, the best thing to do is to employ techs to do research into technical things. If we see that research as a requirement, just as a feature would be, we can start to see where a SCRUM methodology can be employed. The biggest hurdle is convincing programmers that they're not wasting time by being involved in such 'airy-fairy nonsense' as research.
It's always been in the conceptual development phase that things have become hairy or clouded for me. It never seems right to try and define a system in a slight vacuum (after all, techs can answer our questions, but they too have to estimate the complexity of things from time to time), and the tech team gets (rightly) pissed off waiting for the design team to give them a spec. Here's a thought: what if we were to design in a series of cuts, whereby we define a 'sketch' of the overall system first (based, of course, on the research carried out in the Understand phase), then start to refine it feature by feature once we have some idea of which elements would affect other elements? We'd thereby actually work side by side with techs, constantly flexing and changing the implementation according to what was achievable or desirable.
So, probably (and I say probably because I haven't tried this yet), it would look something like this.
Basic prototyping has proven to be the most effective form of documentation for communicating functionality to a technical team. A basic interactive wireframe made in Macromedia Flash speaks 1000 times quicker than a UML use case, which arguably takes about as long to write once you've wireframed it and analysed all the permutations of an interaction element. I see this as the way forward, in combination with some UML elements, such as flows, for specifying the way a system works and how the interface impacts it.
There's so much more to write here... but it's late and my concentration is failing.
September 17, 2003
Extreme Programming vs Interaction Design
Kent Beck is known as the father of "extreme programming," a process created to help developers design and build software that effectively meets user expectations. Alan Cooper is the prime proponent of interaction design, a process with similar goals but different methodology. We brought these two visionaries together to compare philosophies, looking for points of consensus—and points of irreconcilable difference.
Beck says: To me, the shining city on the hill is to create a process that uses XP engineering and the story writing out of interaction design. This could create something that's really far more effective than either of those two things in isolation.
This end is going to be my primary objective for the coming 6 months. I'll be going with some others from The Corporation to the ForUSE conference, where we hope to hear more on how we get to this shining city. In true Rapid Application Development style, we'll probably end up doing trial and error, hacking about with our process until it feels right. Will be reporting lots more on progress as it happens.