Talks and Lectures

October 09, 2009

We're still too fluffy

OZ-IA is an information architecture conference held here in Sydney annually. I presented this year on a topic which has occupied me for the past few years: selling user experience design and the value of design thinking to business.

The thrust of the presentation goes like this:

  1. We, as a profession, have largely failed to make great product experiences.
  2. There are certain people who matter in the world of design, and it's not designers. It's the people who pay to have things built.
  3. The industry needs to get better at communicating the value of design to the people who pay to have things made. This is called "selling", and we can learn it from traditional salespeople.

Posted by Ant at 09:19 PM | Comments (0)

July 19, 2007

UX Week 2007 is coming

Adaptive Path are having their UX Week in Washington DC August 13-16, 2007. It promises to be an interesting set of talks and panels, with a good focus on designing in Agile environments, tools for doing so and naturally plenty of Web 2.0 topics too.

It's testament to the industry's health that the user experience discipline has conferences dedicated to it. The Adaptive Path conferences tend to be more practical than others, which is extraordinarily valuable for maturing the practice. To build an identifiable discipline (such as 'programming' for example) the methods we use must be common. It's vital that all UX folk are conversant in tried and true tools, as much for their own benefit, as for those who are trying to identify what UX work looks like.

Conferences like this also provide an opportunity to showcase new thinking in established areas, or new technologies and methodologies. For my part, I'll be presenting modifications to Jesse James Garrett's Visual Vocabulary – using them to diagram the flow of rich internet applications. Check it out at the conference website. I'll also be sitting on Dan Brown's panel to discuss how design methods and documents are changing with the Web 2.0 era - it promises to be a lively discussion!

Posted by Ant at 08:44 PM | Comments (0)

March 29, 2005

IA Summit 2005 - BJ Fogg

Designing for Impact was the title of BJ Fogg's presentation about how working with technology changes the way our minds work. BJ wrote 'Persuasive Technology'. These are the notes that I took from his inspiring keynote lecture.

With the digital age a new phenomenon is emerging: the few people who design technology have the power to change the many who use it, and quickly. Just as a ballerina's shoe deforms her feet over time, spending hours on end in an email program or browser also changes our minds.

Are we the 'shamans' of the new age? As the crafters of digital tools we are changing the way people's minds work, and culture at the same time. The urge to press "control z" during a car accident, like a ballerina's deformed feet, is a side effect of this cultural change.

Sometimes we can plan side effects (or just effects). Asking someone to introduce you in a particular way (e.g. "This is Anthony, he is an information architect with expertise in designing social software") has an impact on the course of the following events or discourse.

Users of software should not just 'drift through' an interactive experience. There should be a plan for the outcome of the user's experience. There should be a message behind every interface [Note: This is essentially brand experience design]

The way you design for a one-time behavior change is very different from designing to change behavior over time. People in small towns understand that their relationships with people are likely to be built on multiple interactions. This changes the way they behave compared to those in a big city, where interactions are most likely one-time.

Persuasion as a trend appears to have coincided with the advent of the web. Machines can control human behavior. There are many good things you can do to persuade people, but there are a lot of bad things too.

How can technology persuade you to keep in touch with your family? How can it persuade you to go to the gym? The change in presidential candidates' websites between 2000 and 2004 was big, employing some of the roughly 60 theories about how to persuade people. Academics don't agree on a definitive set of these techniques.

Companies embark on impact analysis whereby they graph their aspirations for their users (e.g. sign-up, pay etc.) on axes of feasibility vs importance (i.e. importance to business success). The top three of these form the primary goals of the company. Persuasion strategies such as praise; persistence; barrier reduction; immediate rewards; pain and fear; social influence; stories (cause and effect); hope (lottery) are then utilized to see that the goals are met.
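
The impact analysis described above amounts to a simple ranking. A minimal Python sketch of the idea; the behaviors and scores are invented for illustration:

```python
# Rank desired user behaviors by feasibility x importance and keep the
# top three as the company's primary goals (all scores are invented).
behaviors = {
    "sign-up":        {"feasibility": 8, "importance": 9},
    "pay":            {"feasibility": 5, "importance": 10},
    "refer a friend": {"feasibility": 9, "importance": 6},
    "write a review": {"feasibility": 7, "importance": 4},
    "upgrade plan":   {"feasibility": 4, "importance": 7},
}

ranked = sorted(
    behaviors,
    key=lambda b: behaviors[b]["feasibility"] * behaviors[b]["importance"],
    reverse=True,
)
primary_goals = ranked[:3]
print(primary_goals)
```

Persuasion strategies would then be applied only to those top-ranked goals.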

The Fire in Captology (what's hot in persuasion right now)

  • Video games - There is a recruiting tool by the US Army called 'America's Army' that persuades people to sign up. Rehearsing behavior in video games influences what players do in real life. Video games are built around rewards and are highly compelling, because players can feel their competency growing. Certain demographics get addicted to these games because they get positive feedback about growing competency that is lacking in their lives
  • Automated Behavior Modification (clicker training) is a classical conditioning technique often used to train dogs (click, then feed a treat). It trains behavior, not rational thought. The idea is to reinforce everything positive with rewards. Computers can train users the way clicker training trains dogs and dolphins: when you do something the computer likes, it rewards you. Slot machines do this with periodic rewards in the form of payouts. Very powerful, very scary. On sounds as reinforcement: running water and harps are loved; horns and alarms are hated. People hate sounds that signal something bad is happening, like a baby crying.
  • Maxim for Credible Design. To increase the impact of a website, find what elements your target audience interprets most favorably and make those elements most prominent.
  • Companies will map the psychographic profile of users to demographics and will sell this to the highest bidder. They will measure this through observing user behavior.
  • Sequencing strategies work, such as asking for something big (a plane ticket) followed by a small request (can you give me a few dollars?). The small request will often be granted after the larger one is refused.

    Who we are as people is expressed through what we create. Methods do matter and we have a social responsibility when creating software.

BJ's Lessons learned
  • Specialize as narrowly as possible. The more you specialize, the broader the impact. This is a law of physics: focusing on one area gives you more power in that area. Think of three ways you can specialize. In 40 hours, you can become the best in the world at a narrow topic (e.g. the passwords of 14-year-old Japanese girls).
  • Take Risks. Find stories in your life where you overcame something big. These help you when things get tough.
  • Appreciate. Feeling appreciation is a very healthy emotion. Your heart and brain become synchronized. Most spiritual leaders preach this.
  • Rebound. When you fail, bounce. Just get up and keep going. The world keeps going, so you better get up. Walking is controlled falling. You have to fail to learn.
  • Allow yourself to be guided by principles, and work with communities to help you get there. Working together we can achieve our goals.

Posted by Ant at 08:42 PM | Comments (1) | TrackBack

December 22, 2003

XPDAY - Mary Poppendieck, Tom Poppendieck - Lean Development - Tuesday 2nd December 2003

Waste is anything that doesn't create value for the customer. If the customer would be as happy without something, then don't do that something.

Value Stream Analysis - [Ant see this entry for a little more on that]

Increase Feedback
For fixing troubled projects, increase feedback at customer, team and management levels. A lack of feedback somewhere in the chain is the most common area for improvement in project debriefs.

Make change inexpensive. You can do this by:

  • delaying commitment
  • sharing partially complete design information
  • developing a sense of how to absorb changes
  • avoiding extra features
  • developing quick response capability
  • developing a sense of when to make decisions

Encapsulate variation
Group things that are likely to change together inside one (code) module, so that their interdependencies stay contained.
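
A minimal Python sketch of the principle; the pricing example is invented for illustration. The part likely to change (the pricing rule) sits behind one seam, so callers never change when it varies:

```python
# Encapsulate variation: pricing rules are the volatile part, so they
# live together behind one stable interface. Adding a new rule never
# touches the calling code.

def flat_rate(quantity):
    return quantity * 5.0

def bulk_discount(quantity):
    # Variation lives here, alongside its siblings, not in callers.
    return quantity * (4.0 if quantity >= 10 else 5.0)

def invoice_total(quantity, pricing_rule=flat_rate):
    # Callers depend only on this seam; swapping rules is local.
    return pricing_rule(quantity)

print(invoice_total(12))                 # flat rate
print(invoice_total(12, bulk_discount))  # same caller, new variation
```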

Separate Concerns
A module should have only one responsibility. Don't make them dual purpose.

Avoid Repetition
Don't repeat yourself. Never copy and paste code.

Defer Implementation
Functionality doesn't have to be built until the order is received.

Deliver fast - rapidly, reliably, repeatedly

  • Wait till there's a need before you start to build. This is a good incentive or way of 'pulling' your requirements out
  • Don't 'push' with a schedule; you can't keep schedules up to date. Too much in a project's life changes, and it's wasted time updating them.
  • Make work self-directing through the use of a visual workplace whereby staff can see what state elements of the project are in
  • Rely on local signalling (like a Daily SCRUM or other methods for communication between team members)
  • Do small batches

Queuing Theory
  1. Steady rate of arrival (short iterations)
  2. Steady rate of service (test features immediately)
  3. Small work packages (integrate features individually)
  4. Reduce utilisation (you don't load servers to 90%... productivity in staff drops off exponentially after around 70-80%)
  5. Eliminate Bottlenecks (everyone pitches in, whenever and wherever they are needed... yes, this means multiskilled staff)

Lean means introducing a gating mechanism so that full capacity is never reached. Lots of half done stuff IS BAD and is the result of overloading. Just as a highway gets choked up when there's too much traffic, the same happens in a workplace when it is overloaded with work.
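
The highway analogy has a standard queuing-theory result behind it: for a simple M/M/1 queue, time in the system grows as 1/(1 - utilisation). This formula is textbook queuing theory rather than something from the talk, but it shows why productivity collapses past roughly 80% load:

```python
# M/M/1 queue: average time in system = service_time / (1 - utilisation).
# As utilisation approaches 100%, waiting time grows without bound,
# which is why lean gates capacity below full load.
def time_in_system(service_time, utilisation):
    assert 0 <= utilisation < 1, "at 100% load the queue grows without bound"
    return service_time / (1 - utilisation)

for rho in (0.5, 0.7, 0.8, 0.9, 0.95):
    slowdown = time_in_system(1.0, rho)
    print(f"{rho:.0%} loaded -> {slowdown:.1f}x the unloaded service time")
```

At 50% load work takes 2x its unloaded time; at 90% it takes 10x.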

Build Integrity In

  • Perceived – totality – overall customer delight
  • Conceptual – the system's components work well together

The way to build both perceived and conceptual integrity is to ensure all aspects of a business are included in the design process. There are 4 questions to answer which cover all aspects.
  • Why are we doing this? This is the business drivers question
    • The Vision
    • Success Model
    • Priorities
    • Capability
  • What needs to be done?
    • User Experience Design
    • Acceptance Tests
    • User Tests
    • Use Cases
  • How do we build it?
    • Programming methods (e.g. XP)
    • Technological platforms
  • How do we support it?
      Ask your support team to be involved in the design process

Empower the team
Kaizen Events are a way to bring a multidisciplinary team together to improve a process or product. The order in which they are done is as follows.

  1. Bring team together
  2. Set the challenge
  3. Brainstorm Solutions
  4. Present recommendations
  5. Decide at a "town meeting" (merely where all stakeholders are consulted)
  6. Implement immediately
To empower the team means that they must have the ability to move autonomously, without fear of reprisal for making decisions.

Posted by Ant at 05:57 PM | Comments (3) | TrackBack

December 10, 2003

XPDAY - Barry Fazackerly - Enterprise XP - Tuesday 2nd December 2003

Enterprise XP is the bastard child of DSDM (Dynamic Systems Development Method) and XP (Extreme Programming). DSDM is another agile process; however, like SCRUM, it is more concerned with overall management than with programming practices. DSDM pays fairly rigorous attention and deference to the business case behind a software development project. In this regard it is more of a business analyst's or consultant's saviour than a programmer's. Enterprise XP endeavors to deliver the best of both worlds.

Always ask 'What is the Business case? What is the Return On Investment? What are the measures of success so far as the business is concerned?' A lot of these concepts are captured in The Balanced Scorecard management approach.

The running order of an Enterprise XP project would look something like this

  1. Feasibility Study
  2. Business Study
  3. Functional Model iteration
  4. Design and build iteration
  5. Implementation

Other links to interesting things about DSDM
Roles in DSDM
Core DSDM Techniques
Overall process lifecycle

Posted by Ant at 02:09 PM | Comments (0) | TrackBack

XPDAY - Martin Fowler - Keynote - Tuesday 2nd December 2003

Martin Fowler started controversially by announcing that he was a bit sick and tired of the Agile world and process methodologies. He used book writing as an example and posed the question: When writing a book, do you do an outline first and then get steadily more detailed? Or do you just muck in, get busy writing detailed sections, and string them all together later? Both methods are 'successful' and merely the personal styles of two different authors. How do you objectively measure the success of the outcome of either of these methods? Isn't it completely subjective? To what do you compare or measure? [Ant - Here I would suggest that perhaps setting your own success criteria is enough, providing you do so before you set out to do the work].

Martin is more interested in knowing what makes good software design and how to achieve it. SWEBOK (the Software Engineering Body of Knowledge) is attempting to define what a software engineer should be knowledgeable about. What would a software university curriculum look like? Analysis, Construction, Design, Testing, etc.

The practice and science of Engineering separates design and construction. Agile, especially XP, does not. Agile advocates doing design and construction together so that each informs the other [Ant - this is only possible because the only cost of building code is man hours and not materials. If physical construction was 'Refactorable' would we engineer and build together too?]. Design can be an evolutionary process. UNIX is an example of successful evolutionary design [Ant - although, is the success of UNIX due to the fact that it's free, therefore there is a pretty natural incentive to adopt and grow it, or is it the success of an evolutionary design that draws people to it?]. Testing, refactoring and continuous integration allow evolutionary design to happen. XP has these three factors that enable evolutionary design to converge.

Only design for the current set of requirements. Future proofing or anticipatory capability isn't so good because extra features and code complicate the system, thus crippling future efforts to adapt and extend the system later. Keeping it simple will enable evolution.

Evolutionary design can work because there are people on the team, willing and able to do it. Teaching design is about showing others how to see problems first. Then over time and apprenticeship, the art of making solutions to the problems can be learned.

You can tell when evolutionary design is happening because the team is motivated, code is being thrown away and the team can freely say 'this isn't working, lets change it'.

Posted by Ant at 12:52 PM | Comments (0) | TrackBack

December 09, 2003

XPDAY - Richard Watt, David Leigh-Fellows - Acceptance Test Driven Development - Monday 1st December 2003

How does a coder know when they are done? The natural tendency for a developer is to continue to perfect a piece of code long after it has attained an acceptable level. The Unit Test ascertains whether a unit of code works. A Functional Test is used to ascertain whether a group of units are working together to satisfy a desired function. But these are both simply testing that the code is working in a manner that the developer intended. The Acceptance Test asserts the conditions of acceptance set by the customer or designer. They are less concerned with finding imperfections in code than with identifying the result of an imperfection. Acceptance tests relate better to requirements or stories than to units or objects at code level.

Acceptance tests are a way for the developer to know when they're done. Where automating acceptance tests is achievable, this should be done. A developer can keep working until the testing application gives 'the green light'.
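
The distinction between the two levels of test can be sketched with Python's standard unittest module. The discount function and the story are invented for illustration:

```python
import unittest


def apply_discount(total, rate):
    """Unit under test: apply a percentage discount to an order total."""
    return round(total * (1 - rate), 2)


class UnitTests(unittest.TestCase):
    def test_apply_discount(self):
        # Developer-facing: checks that one unit of code works as intended.
        self.assertEqual(apply_discount(100.0, 0.1), 90.0)


class AcceptanceTests(unittest.TestCase):
    def test_loyal_customers_get_ten_percent_off(self):
        # Customer-facing: asserts the condition of acceptance for the
        # story "a loyal customer pays 10% less at checkout".
        self.assertEqual(apply_discount(59.90, 0.1), 53.91)


if __name__ == "__main__":
    # 'The green light': the developer keeps working until this run passes.
    unittest.main(exit=False)
```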

Acceptance tests should be written at the same time as stories are specified. This helps to set the goal posts for and aid in estimating particular functionality for all disciplines involved in the process of designing and building software.

Acceptance tests are hard for many reasons. The customer is involved in defining the test; they are not always adept at structuring such a test framework and will thus need guidance. Acceptance tests raise the level of quality assurance support required on the project team. Writing the tests is time-consuming. Maintaining them is also time-consuming and requires careful management and organisation to keep the process as efficient as possible. Setting a common framework for testing can be difficult across multiple work streams. So, acceptance test driven development is not an easy road; however, it is worthwhile, as it ultimately saves time, eliminates waste and increases quality.

A typical chronology for an acceptance test driven framework would look like this:

  1. Select candidate stories for iteration or development
  2. Write the acceptance test skeleton for each
  3. Sense check the candidate stories
  4. Using the iteration planning workshop as a starting point:
    • A small multidisciplinary group takes a few stories aside
    • The group lists the subtasks associated with each story and then provides an estimate on how long this will take to do
    • The group then presents back the subtasks and estimate to the wider group for critique

  5. Check story dependencies and estimates
  6. Pick highest value group of stories
  7. Agree stories for iteration

And that is how acceptance tests should be integrated into an XP process.

Posted by Ant at 03:35 PM | Comments (0) | TrackBack

December 08, 2003

XPDAY - Sean Hanly, Duncan Pierce - Introduction to XP - Monday 1st December 2003

Values of XP

  • Courage = Get on with it. Be honest about your abilities. If it's not working, change it, refactor it, throw it.
  • Simplicity = Keep all things as simple as they can be. Investment in XP products is incremental.
  • Feedback = Monitor performance, inspect and adapt. Measure early returns.
  • Communication = Focus on info that matters, don't do unnecessary documentation.

Agile is a mindset
Agile is about making incremental investments: small changes that serve a longer-term goal. Nothing Agile is done in massive increments, so changing to an agile methodology should be the same. Take small incremental steps toward a better solution, where each step has been evaluated for success. Change one thing at a time, do things simply, and ask 'what already works?'. There is already good in what you do; don't throw it all away. Make a series of small process changes. Using XP is based on these principles: get feedback, measure benefits, inspect, adapt.

When prioritising features, look at which will make for early returns and assess which of those deliver most value to the customer.

When designing and building, see good enough, not perfect (this pertains to the type of feature, rather than the quality of workmanship).

Organic processes work, not just the mechanical and systematic. Order within teams can rise from chaos if the team is empowered to organise themselves.

Don't be afraid to measure progress; make it visible and be honest about it. Progress indicators are only as valid as the last measurement. Assess progress over one 'time box' or 'iteration' and then estimate the next based on the progress, or 'Velocity', of the last.

In XP there are two types of planning sessions: one for each iteration, and one for each release. When using XP you assume that you are not going to release full functionality in the first release, but build it up over a series of releases. The release planning meeting should have the 'Customer' (a term used within XP to denote the sponsor of the project), developers, tester(s) and an interaction designer all present. Each requirement is captured on an index card as an abstract description of something that the system will provide. Each of these is estimated in terms of required effort and value. One effort unit is usually equated to a perfect day's effort with no distractions. This unit is then given an arbitrary descriptor (e.g. Gummy Bear) to abstract future indications of velocity away from any expectation of time scale. The value indicator equates to the relative importance of each requirement over the others. Iteration planning involves writing acceptance tests (even if they are not perfect) for each story and ordering the stories according to value and effort. In this case, the team must balance what can be achieved in the iteration with which story has the most value. Where possible, acceptance tests should be automated, with development and refactoring carried out only until the unit of code passes its test.
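
Balancing value against effort within an iteration can be sketched as a greedy fill. A minimal Python sketch; the story names, efforts and values are invented for illustration:

```python
# Order stories by value per unit of effort, then fill the iteration up
# to the team's velocity (effort units, e.g. "gummy bears"). All story
# data below is invented.
stories = [
    # (name, effort, value)
    ("user login",     3, 8),
    ("password reset", 2, 5),
    ("export to CSV",  5, 4),
    ("audit log",      4, 6),
]
velocity = 8  # effort units available this iteration

planned, capacity = [], velocity
for name, effort, value in sorted(stories, key=lambda s: s[2] / s[1], reverse=True):
    if effort <= capacity:
        planned.append(name)
        capacity -= effort
print(planned)
```

A real team would negotiate the trade-offs rather than sort mechanically, but the value-versus-effort balancing is the same.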

A good story looks like this

  • Independent
  • Negotiable
  • Valuable to the customer
  • Estimable
  • Small
  • Testable

Measure a story according to whether it increases or protects cashflow, profit or return on investment. Each story should have four identifying attributes: story name, acceptance test(s) name and location, effort estimate and value indicator.

Acceptance tests
Tests that let the project team know when they are done are crucial and useful for setting targets. There are three levels of test. Screen-level tests cannot be automated and involve user testing for comprehension, plus visual checking by the design team to ensure screens are as specified. Interaction tests can be automated and test whether certain use cases work, technically. Engine tests are base-level code tests and can be automated. A tip for writing all tests: make the language and labeling as consistent as possible, as far up as the customer level and as far down as the object (code) level. This reduces the need for documentation and aids communication within the team.

Posted by Ant at 05:29 PM | Comments (0) | TrackBack

XPDAY - Mary Poppendieck - Keynote Monday 1st of December, 2003

Mary's talk mainly focussed on the wider issue of productivity and engendering it within a workforce. She opened with the statement that productivity has a direct relationship to standard of living via increased profits; increased profits are derived from increased sales, and history has many examples where this is evident.

The first steps to increasing overall profits within an organisation are to focus on the core business practices and work toward being more productive than the competition in these areas. Then work on non-core business practices and match the competition's performance on those.

Being productive does not mean sacrificing quality. You could improve speed and decrease overall quality or value, and this would not be increasing productivity. Productivity is about putting the same effort into something and getting more out of it by changing the method in which the work is done. Or being able to charge the same amount for doing less work.

We can be productive by either reducing direct cost (i.e. what a client would pay for) or reducing indirect cost (i.e. streamlining processes and methods). The primary way to reduce direct cost in software development is to build only functionality that is required. Usually 80% of software product functionality is used infrequently or not at all. Each piece of functionality should have its return on investment measured, and then only those yielding the highest value should be built. Do the minimum marketable features, then release. "Release early, release often" moves profit forward in time, thus paying for future releases.

Value Stream Mapping is a good tool for analysing the way in which a business spends its time. It is derived from Japanese manufacturing process analysis. It is basically done by stepping backwards through a process and documenting time taken to do a task and time wasted in between tasks.
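
The backwards walk through a process can be sketched as a small calculation. A minimal Python sketch; the steps and durations (in hours) are invented for illustration:

```python
# Value Stream Mapping sketch: for each step, record the wait before it
# and the actual task time, then compute the value-added ratio.
# All steps and durations below are invented.
steps = [
    # (step, wait_before_hours, work_hours)
    ("gather requirement", 40, 2),
    ("design",             16, 6),
    ("code",                8, 12),
    ("test",               24, 4),
    ("deploy",             48, 1),
]

work = sum(s[2] for s in steps)
wait = sum(s[1] for s in steps)
total = work + wait
print(f"value-added time: {work}h of {total}h ({work / total:.0%})")
```

Ratios like this make the waste between tasks visible, which is usually where the biggest improvements hide.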

Overall, success should be measured not per employee, but by the increased productivity of the organisation as a whole. There is a Japanese term 'Keiretsu' which encapsulates the notion of a cooperative group of related companies supporting one another. Productivity for a software product can be measured by the increased revenue in the supported business per dollar spent by the IT organisation charged with maintaining it. To help design for this, it is very helpful to have those who will support the product, from the technological level through to customer service, involved in the design effort.

Other books from which lessons learned in the manufacturing sector can be taken here. More on Mary and Tom Poppendieck's work here

Posted by Ant at 03:04 PM | Comments (0) | TrackBack

October 22, 2003

ForUSE – Debate: Patterns or Process – Constantine and Lockwood

Debate: Patterns or Process? What works for Usage Centred Design? Lucy Lockwood (process) vs Larry Constantine (patterns).

Patterns are a way to capture the wisdom of many years of design experience. They help to expand and extend the awareness of the experienced designer and aid to save the inexperienced designer. Or are they just a poor, inefficient substitute for a well-implemented process?

For Patterns Larry Constantine
A pattern is an idea that has been useful in one practical context and will probably be useful in others. Patterns describe recurring problems along with effective solutions in a standardised, useful form. Variations may include the name, a description of the problem, the applicability of the pattern, solutions, and consequences or results. Some are just Thou Shalts... or shalt nots.

Welie patterns are some well-known and often-used patterns, though they are a bit "Like, duh". Patterns can range from ways to display tables with alternating row colours to the metaphor of a shopping cart. Good patterns, though, capture best practices gained through experience. Patterns describe simple and elegant solutions to problems and capture solutions that have developed and evolved over time. Hence they aren't the designs people tend to generate initially; they reflect untold redesign as developers have struggled, and design patterns capture these solutions in a succinct and easily applied form. (Design Patterns, Addison-Wesley, 1995)

Best practice patterns should be non-obvious or even counter-intuitive, have broad but specific application, clearly spell out the problem to be solved, clearly articulate the solution and its tradeoffs, and exemplify best practices. Basically, they are the subtleties within the obvious patterns. They should be based on substantiated patterns.

Against Patterns Lucy Lockwood.
UI design is still at the arts-and-crafts stage. Early civil engineering and architecture were mostly learned by apprenticeship: people copied what they had seen work, and there was a very limited number of '3rd degree wizards'. The development of engineering principles and processes allowed for the expansion of the engineering corps. There is no one-size-fits-all yet, because there just isn't the depth of knowledge to be able to say "This worked here in exactly this situation, so it will work there". Patterns still rely on the '3rd degree wizard' to produce best practice. Process allows those who aren't '3rd degree wizards' to work within a framework to achieve best practice.

Practicing designers need help recognising problems and poor design... concrete principles, not cognitive psychology. They need direction for improvements and practical guidance, not canned solutions. There is no such thing as "UI standards". Patterns also only cover 10-20% of problems, and they are often misapplied. They do NOT support creativity or innovation.

Who captures and maintains patterns? A company wants designers to design, not to waste time capturing and maintaining patterns, and it won't pay a designer to do this. How do we decide what's a good pattern? It's totally subjective. How do we know that the pattern we have is the best pattern, one that can be trusted? If a pattern is written to be timeless, it is too abstract and vague: the description is so general that it doesn't offer specific guidance. Otherwise, patterns are too tied to current technology and fashion, or to a particular context.

Where do you find patterns? How do you know which pattern you need? They assume that the designer has analysed the problem well enough to choose a suitable pattern. Patterns can compound a problem that wasn't adequately solved in the inception of the pattern.

[Thought: Competitor analysis is really about gathering patterns...]

Five general rules of usability that patterns are trying to achieve....
Access: make the system usable without help or instruction.
Efficacy: Don't interfere with those who know it already.
Progression: Facilitate knowledge advancement
Support: Support the real work users are trying to accomplish.
Context: Suit the system to conditions and environment.
Basic Usability principles: Visibility, Feedback, Structure (layout dictated by meaning and use), Reuse (use interface components and behaviors consistently), Tolerance (forgive mistakes), Simplicity.

Posted by Ant at 02:39 PM | Comments (0) | TrackBack

ForUSE – The Agile Customer's Toolkit – Tom Poppendieck

Raw Notes...

Most books and publications about XP and agile are very programmer oriented. There's no place for requirements analysis, UI, or interaction design.

Writing effective use cases –

Tools for Customer side practices:
Decision Tools (how to decide what to do)
Role tools (how to organise work and team)
Interface tools.
Story telling and customer tools

Effective collaboration is based on shared:
Values – an implicit belief system, vision, or mental model about the desired business reality or purpose.
Principles – guiding ideas, insights and rules for deciding.
Practices – what to do; actionable.

Manufacturing metaphors for software development don't work because, if you talk to manufacturers and engineers, every time something is built the process is different and unique. Toyota will stop at every stage of the manufacturing process and test whether it actually works.

Lean Principles... Tool 1.

Eliminate Waste – Basically, this is just being effective. Weed out the parts of the process which are not.
Amplify Learning – This is what iteration is all about. Learning is more important than the code itself.
Decide as late as possible – By waiting, you get more information and feedback.
Deliver as fast as possible – If you are going to wait, you have to deliver fast.
Empower the team – Agile processes focus on people. Complexity means working together... as effectively as possible. Let the team figure out how to do their job, because they live and breathe it and will work out the easiest and most efficient way for them. It will also make them happier.
Build integrity –
See the WHOLE – It is more valuable to measure effectiveness as a whole, rather than the individual's effectiveness. Team means that everyone is accountable. Rewards should be given to the team, not to individuals within a team.

Concurrent Development... Tool 2

Why are we doing this? What needs to be done? How do we build it? All happen concurrently. Requirements are handed over at increasing levels of detail as the project proceeds.

What do we do first? Breadth or depth? Both. Breadth: low-detail system intent, plus release and iteration planning. Depth: most important features first. A working app every iteration.

Doing the most valuable first is more important than doing the most risky... [what is value?]. Build by feature; order by ROI. Defer commitment. Simplicity, feedback; let subsystems and frameworks emerge. Value learning over code. Real customer needs, re-planning and re-factoring are not rework.

Chartering... Tool 4
I don't know why he skipped over tool 3... something about a chair with legs.

Chartering. A team is a community and needs a purpose to exist. Individuals contribute differently. Customers > business goals. Testers > quality goals. Interface designers > usage goals. Developers > architecture goals. Analysts > domain goals. A common understanding of purpose is required so all understand the mission. Define success, frame boundaries, facilitate information flow. Align decisions. Mission statement or elevator pitch. Objectives outlining the purpose of the project and what it's about. Committed resources. Who defines the success? What is important? What is success? Scope? Schedule? Defects? Resources? You should be able to express all aspects of your mission on one page. If you can't say it concisely, you don't understand it.

XP Practices... Tool 5

Customer Practices: Release and iteration planning. Frequent small releases. Customer tests. Story telling. UI Model. Essential Use Case model...
Developer Practices are the Engine.

Agile Development cycles. Release planning is all about breadth. Iteration planning is a mixture of depth and breadth. Implementation planning is all about depth.

User Stories... Tool 6

Often confused with Use Cases. Stories cover everything the user cares about, both functional and non-functional. Story Card content: Title; one or a few sentences describing what is wanted, written by the customer; developer estimate of relative cost; sample tests sketched on the back. It's a hand-written index card, which has a tactile advantage. Low inhibitions to throw away. Used for sorting, allocating and tracking. The card is not everything though. You need card, conversation and confirmation. The value is in talking about it, not in documenting it... [hmm, I can see a few probs here from my experience]. Story sizes are determined by implementation effort. They should amount to two to five pair days.

Stories enable FLOW... Velocity can be gauged by assessing the history of iterations and the burn rate, which establishes reliability.
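A minimal sketch of what gauging velocity from iteration history looks like, assuming story points as the unit (the numbers here are made up):

```python
# Invented iteration history: story points completed per past iteration.
completed_points = [18, 22, 20, 19]

# Velocity is just the average throughput of completed work.
velocity = sum(completed_points) / len(completed_points)

remaining = 120  # story points still in the release backlog (assumed)
iterations_left = remaining / velocity
print(f"velocity = {velocity} pts/iteration, about {iterations_left:.1f} iterations to go")
```

The reliability claim in the notes falls out of this: the more iterations in the history, the more stable the average, and the more trustworthy the forecast.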

Tom advocates that end users, UI designers, UC designers, subject matter experts, testers and analysts, and process and product owners define the right stories and the right tests to then feed requirements to the developers... [This is really quite similar to the approach to staggering work that I was writing about a week or so ago]

Domain Language... Tool 7

Find the right words. A glossary of domain language will save a lot of pain and arguments between the team. Effective communication depends on a shared language. EVERYONE must have a common understanding of what is what. A domain should be what the software is about. Domain concepts are what the system shows, knows and remembers. Stories, conversations, customer tests, and code should use domain language. The customer already knows it, though they may not always use it precisely. The language must be rich enough to capture business concepts, rules and relationships. Domain concepts will usually make an appearance on the interface too. It should be directly implemented in code... [lovely gem! It would be worth doing research into the language of the audience too, I would say. This would make this even more powerful]. You should iterate this language too. It should be defined in the same way as you would plan doing XP... work on breadth and depth at the same time. Language extends to informal UML, digital photos, class diagrams, interaction diagrams, state diagrams. Make them quick and disposable. [oops... we've been too detailed in this area and spent too much time doing beautiful flow diagrams, thus making them set in stone on some level. They didn't facilitate throwing away and starting again. Makes me think we should come up with a velcro-backed kit or something for making dynamic, easily reworkable flow diagrams... that would be cool!]

Essential Use Cases... Tool 9
yup, he skipped a whole bunch more. Will ask about that later.

What people use the system to do. Business process workflow. A use case is about a business goal. Steps to reach the goal. User intent steps. System responsibility steps. Use cases implement some part of a workflow. Effective use cases are lean. Most write too much. Use conversation and tests to define details. A use case is a users steps to achieve a goal. A story is a unit of developer work. Don't mix them up.

Alistair Cockburn - Writing Effective Use Cases - buy it. Essential use cases are really brief! Like REALLY brief: enough to fit on a palm card.

Value to use cases should come from the frequency of use. However, make sure they're not subgoals... as in something that must be done to achieve something else. Prioritise according to value... Now he's going very fast through really good stuff and my brain is tired after a long day. I have slides, so will have to try and fill in the gaps later.

Interface Model... Tool 10

Organise tasks, paper prototype, refine prototype, define tests... eaarrrragghhh!! oh dear, brain has ceased to function all together now. :-(

Tom Poppendieck

Posted by Ant at 12:45 AM | Comments (0) | TrackBack

October 21, 2003

ForUSE – Designing for Performance – Helmut Windl

Designing for Performance - Helmut Windl - Siemens (automations and drives)

... Notes are a little more raw than usual, but the content is quite exciting if anything can be gleaned from these scribblings.

Electronic Performance Support Systems (EPSS) with Usage Centred Design (U-CD) N.B. this is NOT the same as User Centred Design (UCD).

EPSS: workflow and tasks are all visible in the interface. Software applications that have an explicit goal of supporting work performance and thinking by people who know neither the work nor the software, while accomplishing expert-level work.

Performance centred design is driven by user performance, whereas U-CD focuses on usage and improved tools supporting task accomplishment. Performance centred design is still a philosophy. There is no explicit process, but it utilises lots of other processes to inform design.

To design for use you need to understand your users, their work, and their needs.
Users > Roles: Separate user actors from system actors, then model the roles user actors play in relation to the system.
Work > Tasks: Identify tasks needed to support user roles, cluster tasks by use and meaning, define intentions and responsibilities for each.
Needs > Tools and Materials: Model UI contents needed to support task clusters. Derive visual and interaction design from models.

Product definition with product framework, user profiles, features and functions to roles to tasks to contents to implementation model. Also Creative design such as aesthetics. U-CD can be used at any size project from an agile one, to a big bloated process.

Performance Centred
Process simplification: reducing complexity and the number of steps, etc.
Performance Information: All information necessary to perform a task is provided in context.
Decision Support: Helps the employee, depending on work situation and condition, to take the next appropriate steps.

U-CD task model is currently unable to represent sequences of tasks and conditional branches visually in a model.

To redesign and simplify work flow, we have to capture and understand the client's real work process. This includes events leading up to and after the task itself.

K3 modeling techniques: like contextual design for capturing work process. [must find out more - looks good]. Designed to collect and represent real-world work processes in an easy-to-understand model, collaboratively with users. K3 diagrams of the inspected work processes are drawn during field studies and are generalised to universally valid K3 diagrams that directly feed user role and task modeling. Foltz, Killich, Wolf, 2000

K3 Notation
Activities represent related tasks supporting a high level goal e.g. cut, copy, paste... represented by rounded rectangles. Control flows connect activities to indicate the sequence. Sequences are enclosed in start and end states. [This is quite similar to the way I would advocate modeling flowmaps, which are heavily influenced by Jesse James Garrett's visual vocab]. "Swim Lanes" [This is a really nice enhancement] allocate activities to personas, user roles or organisational units. Decisions by a user role or person are indicated by a diamond... etc

Much of U-CD doesn't enable flows in a visual way. So melding this with K-3 notation, you can get the Task Flow Map. I'm not going to make notes on this because it's too interesting to be distracted... besides which it's visual and my thousand words won't justify these lovely diagrams!

Exploratory Modeling [my thoughts] – gather questions about a task case or user roles through taking a first attempt based on personas and educated guesses. Then validate these through interviews and contextual enquiry.


Posted by Ant at 04:50 PM | Comments (0) | TrackBack

ForUSE - Panel Discussion – Between Extreme and Unified

Between Extreme and Unified: Where are the Users and Usability in Development Processes? - Panel. Ivar Jacobson, Jim Heumann, Ron Jeffries, Jeff Patton, Larry Constantine.

Jeff Says... Interaction designers make great Extreme Programming customers. Don't design up front. Quote "You keep using that word [design]... I do not think that means what you think it means..." You can't design without having an integrated, multidisciplinary design approach. We're talking about very thick design, from surface down to data level. One person's design is another's requirements. Do as much design as is necessary to proceed to that next step in development...

Ron Says... All software development methodologies are based on fear... Kent Beck. Fear from customers, fear from management. Programmers using XP are not predisposed to any particular order of doing things. They will however need to reshuffle things if some requirements are brought to an iteration. One in for one out. You will get things on time.

Jim Says... Usability fits into the Rational Unified Process. Users are at the centre of RUP (based on use cases). Actors in a business use case model define the value of use cases to certain actors within the business. Usability comes in at the business level and the interface level. Use cases need to be at the right level so as not to constrain the creative team and not to let them go too wild. Write the right use cases. Write the use cases right. Write the right system. Write the system right.... nnk...

Ivar says... We essentially agree. To be successful in the Software industry, we must raise the level of competence in our teams. Tools are going to get better and better in time as they develop. We need knowledge as an industry captured as best practices. These are not only best practices for designers but also for managers. We need to develop a process for the complete product life cycle.

Processional March question to Jim and Ivar. Give one example of a project that followed RUP correctly but failed? ... no answer
Same question to Jeff and Ron about XP. "Everything seemed to keep going fine, customers were literate, the process was working well and the quality and usability was high. But the client hadn't listened to their users adequately, so the product ultimately failed."

What people are best suited for XP? People who will take on any task and focus on getting a good result for the team. The team works best when it works together. We want people who are good at what they do and who don't hide behind a process.

What people are best suited for RUP? There are no specific kind of people that are necessary for RUP. RUP is about establishing a common language so that communication can be facilitated. You want people who are also good at what they do. You also want people who are diverse in skills so that they can empathise with other members of the team.

What kind of project is XP best at? The best kind is a project that has a finite amount of time, with mainly low risk as far as human life is concerned.

What kind are ill suited to XP? Organisations that don't hold the values sympathetic to all agile processes cannot make Agile processes work.

What kind of project is RUP best at? RUP wasn't originally designed as a management process. It was designed to help people know what a good way to do 'x', or a practice, actually looks like. RUP was made through analysing what was common across lots of projects. RUP is a framework of knowledge, not a specific process. You can apply it to anything from web development to military applications. It can be big or small. It's been going for 25 years. It is designed to be specialised or customised, not one size fits all.

To the best of your knowledge, what % of RUP adopters actually do it right, instead of just buying the software and going to classes? About 20%

And XP? The number is increasing, but we would guess a smallish percentage.

What, if anything, does XP offer to help the overall visual architecture or organisation of the user interface? Nothing. XP is very much about making good code. It's up to the customer to specify the UI.

And RUP? Through creating a user experience model that is derived through the use cases. Representing flows and flow maps to show how screens fit together. There are specialists on a RUP team dedicated to making a good UI.

Can UML be used as a tool to communicate with XP teams? But of course! One of UML's strengths is aiding collaboration... XP people say that not many people actually know how to use UML properly. It's a very specific language. It is a good tool to know, but you shouldn't rely on it as a communication protocol.

In XP, is the role of Customer responsibility a confusion of expertise? Can a customer specify a good user interface? "Well, it's up to the customer". Jeff says, you can't expect a customer to design something if they're not qualified to do so. It comes back to common sense... use your head. Can your customer specify a user interface? If they can't, then perhaps you need to accommodate them with someone who can realise what they want in the form of an interaction designer.

Posted by Ant at 03:07 PM | Comments (0) | TrackBack

October 20, 2003

ForUSE - Instructive Interaction - Larry Constantine

Instructive Interaction: Innovating without Inundating Users – Larry Constantine.

Raw Notes...

Talking about what users are used to. Based on premise that users can't tell you what they need, but only what they don't like.

Instructive interaction - help systems as a start. Help systems are often not really very helpful. Finding the help you need takes too long and you lose track of the task at hand. The result is that help is under used, and therefore software companies under-invest in it because they've got figures saying nobody uses it.

Putting help into the interface... or 'instructive interaction' - this is starting to sound a lot like the work we've already done in Single Sign On - where the user interface is self-teaching.

Learning usually happens by being told, by being shown, or by doing something over and over again. Most learning requires repetition and trial and error. Exceptions are 'prepared learning' (single-trial learning or anticipatory learning... the 'oh, I knew that' effect). The user encounters a novel or unfamiliar feature, guesses what it does or how it works, then tries it and is finally rewarded by the discovery that they were right. 1) Recognition, 2) Anticipation, 3) Action, 4) Confirmation. [this is kind of like defining 'learnability'] The interface should provide all needed help. The interface has to be 1) explorable 2) intuitable 3) predictable 4) have intrinsic guidance.

Instructive interaction is not about the system trying to do things for the users instead of the users doing it themselves. It's not about artificial intelligence. It's about making something consistent and learnable like a tool - a hammer. User agents are also NOT good instructive interaction tools, as they're more annoying than useful over time.

Explorable interfaces have no penalties for playing or trying things. Allow users to get out of a situation easily, so that it's forgiving. Consistent, safe cancel and rollback. Infinite levels of undo and redo are also very good (as in at least 12), even in web applications! (big oops on SSO). Beginners love menus because they allow you to see where you are and how you got there. Nested dialogues are the opposite of this, as they obscure views of the previous path.
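The undo/redo point is usually implemented with the classic two-stack pattern. This is my own minimal sketch of that standard technique, not something from the talk:

```python
# Minimal two-stack undo/redo sketch: the standard pattern behind the
# forgiving, explorable interfaces Constantine describes.

class History:
    def __init__(self, state):
        self.state = state
        self.undo_stack = []
        self.redo_stack = []

    def apply(self, new_state):
        """Record the current state, then move to the new one."""
        self.undo_stack.append(self.state)
        self.state = new_state
        self.redo_stack.clear()  # a fresh action invalidates the old redo path

    def undo(self):
        if self.undo_stack:
            self.redo_stack.append(self.state)
            self.state = self.undo_stack.pop()

    def redo(self):
        if self.redo_stack:
            self.undo_stack.append(self.state)
            self.state = self.redo_stack.pop()

h = History("draft 1")
h.apply("draft 2")
h.apply("draft 3")
h.undo()
h.redo()
print(h.state)  # prints "draft 3"
```

Because the stacks are plain lists, the history is "infinite" up to available memory, which is exactly the forgiveness property: nothing the user tries is unrecoverable.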

The user's best guess is probably right [this feels really right to me. I'm always wanting to do lots of 'expectation measuring' within user tests... ask a user what they think will happen by doing a thing, before letting them do a thing]. Windows are like different rooms, you better have a good reason to send a user to another room – Cooper. Contextual help and feedback should be intrinsic to the appearance and behavior of objects, rather than something that's added on to them. It's not so much about MESSAGES, but giving feedback through the behavior of a certain object when manipulated.

Consistency of UI appearance, behavior, and organisation are all important. Behavioral consistency over consistent appearance. Predictability is most important of all. Affordances are, as always, very important to a learnable interface. Static visual affordances are VERY important, as they're always available to the user even when not actively in use. Tooltips, balloon help, starting point highlighted, workflow line (draws the eye from one place to another), etc.

Balloon help - answer 'what is this?' 'what can I do?' 'what should I do?' Cascade screen tips to add more information than just the name of a tool. Link it to the right article in the help section. "Progressive Screen Tips"... make sure you can allow for it to be turned off.

Progressive enabling and disclosure – unobtrusively walk users through a series of actions... wizards don't help skill building because you're still making it into magic rather than teaching the user to do something using the interface.

Anticipatory Action – system tries to guess most common action such as highlighting an option within a dropdown list. Especially for a new concept in a UI you would have the item open/active the first time they come to the interface.

Implicit antecedents... avoid rigid logic or imposed order... don't think like a programmer... skip steps to anticipate what the user really wanted to do (e.g. radio buttons auto select on accompanying value fields being filled out)

Sometimes you need to animate certain interactions in order to have a concept make sense. Sometimes you just can't write out a long-hand description of something you can illustrate really clearly with a clear, even stop-frame, animation.

Sometimes, standard, well established icons and visual elements/interaction idioms can be used in new ways. The key is to test and see whether you've bent the rules too much.

This kind of reminds me of previous lessons in music... "Practice doesn't make perfect... PERFECT practice makes perfect." Learning has to be done right, and anything you can do to aid users in getting it right the first time, the better off they will be and the quicker they will know your design.

Larry Constantine

Posted by Ant at 09:49 PM | Comments (2) | TrackBack

ForUSE - Jeff Patton

Usage Centred design in Extreme programming and agile development environments

Raw Notes...

Agile software development isn't anything new. Books so far (since 1971) have discussed the psychology of teams and programmers within them.

  • Scrum
  • Peopleware
  • Dynamic Systems Development Methodology
  • Crystal Methodologies
  • Feature Driven Development
  • Adaptive Software
  • Extreme Programming

Born of the financial need to make things quicker... a meeting of 17 people at Snowbird, Utah, in 2001 formed 'the Agile Alliance'. The 4 core values of the Agile Alliance:

  • Individuals and Interactions over Processes and Tools (within the business)
  • Working Software over Comprehensive Documentation
  • Customer Collaboration over Contract Negotiation
  • Responding to change over following a plan.

Other additional statements are also important and can be found at The Agile Alliance website.

Agile, like UCD, is an approach to a method, not a method itself. Releases are composed of increments, which deal with making features. Release cycle: plan release, feature list, evaluate release. Increment cycle: plan increment, determine feature list, evaluate increment. Feature cycle: design feature, develop feature, evaluate feature.

Less emphasis on artifacts and up-front design, more on customer and end-user collaboration, and emphasis on day-to-day collaboration within the development team. Incremental improvement resulting in WORKING and USABLE software. Feedback using iterations.

Interesting XP points. Simplicity in design, test driven development, collective ownership, coding standards, System Design Metaphors, Frequent small releases, Customer acceptance testing.

Injecting UCD into XP - Release: Reconcile roles and goals with tasks, then features. Role and task determine feature priority. Role and task information drive feature design. Use feature priority and cost to find scope-cutting opportunities. Increment: Role and task information determines bug criticality. Feature: Role and task information drives feature design. Test using use cases, assuming a user role.

Understanding the Domain. Contextual design work, taking photos of your end users, hear from them and their managers and other stakeholders. All the team needs to be across the end users.

Tactile collaboration tools and techniques - all your typical post-it note jockey stuff. Food and a kitchen timer are helpful. Means, motive and opportunity (as a way to get people involved in collaborative working: basically, make it easy for them to do so). Make it fun and quick-paced.

Accuracy and Detail aren't the same thing. Focus should be on Accuracy to begin with. Detail can come later... [ this really suits my 'Cut but Cut' thoughts ]. A conversation is better than a document. A poster is better than a document (they're like radiators of information - nice metaphor!). Avoid literal UI renderings.

Must get Alistair Cockburn (Humans and Technology inc.) Book.

Use focal roles and focal task cases to drive priority. Relax standards on the unimportant features. Special attention to quality for focal roles and tasks.

Detailed design comes in where necessary... as in later on. Write essential use cases, build abstract UI prototypes, Render wireframe UI, Validate through testing.

Agile is great because it allows for the mistakes YOU WILL make. The penalties are less because of the iterative nature of the process.

There are problems, but they mainly come from team issues. People not wanting to change or being freaked out by being out of their comfort zone.

Collaboration plan - contract between customer and design/development team.

Jeff Patton

Posted by Ant at 07:37 PM | Comments (0) | TrackBack

Hey, It's Ralph


Ralph Lord is someone I know from a few lists (UCD and SigIA). He, Andy and I talked about the film industry's 'packet' preparation, and how this can be examined for lessons in our process, particularly the early sketch design phases and ensuring they meet criteria set by business objectives and strategy.

Posted by Ant at 07:01 PM | Comments (0) | TrackBack

ForUSE - Designing for Breakthroughs in User Performance – Gennine Strope

More raw notes...

Case study in Usage Centred Design meets Agile in making management software for nurses.

Brainstorm User Roles
Mapping roles
Brainstorming tasks
Prioritising tasks

Some mistakes were made where profiling wasn't thorough enough and goal-directed design wasn't thought through well enough. These guys had something like 20 'severity 1' usability flaws still remaining after 20 sets of user tests. No design lead time, straight into engineering. This is supposed to be selling us on XP and usage centred design, and so far I'm feeling like it's telling us that if you measure success by user comprehension and understanding, this just didn't stack up well. However, she's touting this as having shaved 4-6 hours off training nurses compared with the old system. But cripes! What must their old system have been like?

Lots of people say small design teams are best... so does she. Basically advocates contextual design. Keep statistics! (Good, something new... I was scared there'd be nothing to think about from this.) Proof of progress within the business. Return on investment must be measured for justifying to management.

She's got a 'pocketfull of wisdom!' omigod, that's slightly nauseating... as is the powerpoint sound effects.

Posted by Ant at 04:54 PM | Comments (0) | TrackBack

September 22, 2003

Tim Berners-Lee

Went to a lecture at The Royal Society given by Tim Berners-Lee this evening. He made some interesting points in a frenetic and charismatic style. The main thing I took away was that his initial vision of the web was born of the requirements for a document control and sharing protocol. A founding principle of this was independence from software or hardware platform, network accessibility, application, language, culture, disability... and so on.

"To seperate content from form is good design." was one statement that I've heard before, but rarely has it had as much resonance as when put in context of the lofty goals of what the architects of the web set out to achieve. The standards laid out by WC3 have been a little lost on me in my more ignorant past. Over time I have realised that the standard is our friend and to deny it for aesthetics, economy or perceived flexibility, is short sighted.

Tim Berners-Lee made me realise that web ubiquity will only be achieved through a combination of standards and goal directed design. This is what will lead us to his vision of The Semantic Web. He uses the metaphor of the London Underground tube map to illustrate a web connected by RDF ontologies. Different lines represent different data properties of a relational database (e.g. calendar or event). Where those properties intersect (represented by an interchange on the tube map) with a subject (e.g. time or location), we find a value which is of real usefulness in our everyday lives. At the moment, those properties and subjects do not intersect to create values in terms of the aforementioned application and platform independence. Instead, we have to 'manually' connect them in our heads. In this regard, we are still pre-web (if we go by the measure of what the web set out to achieve).
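A toy illustration of the tube-map idea, using RDF-style (subject, property, value) triples. This is just my sketch of the intersection metaphor; all the names, URIs and data below are invented:

```python
# Invented RDF-style triples: (subject, property, value).
# Each property is a 'line' on the map; a subject is where lines intersect.
triples = [
    ("ex:talk42", "ex:title", "The Semantic Web"),
    ("ex:talk42", "ex:start", "2003-09-22T18:30"),
    ("ex:talk42", "ex:location", "ex:royalSociety"),
    ("ex:royalSociety", "ex:address", "6-9 Carlton House Terrace, London"),
]

def value_of(subject, prop):
    """Follow one 'line' of the map: look up a property of a subject."""
    return next(v for s, p, v in triples if s == subject and p == prop)

# Where the event line and the place line intersect, we get a value we can
# act on without connecting the data 'manually' in our heads:
venue = value_of("ex:talk42", "ex:location")
print(value_of("ex:talk42", "ex:start"), "at", value_of(venue, "ex:address"))
```

The machine-followable hop from the event to the venue's address is the kind of intersection of properties that, in the notes above, we currently still perform by hand.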

You should be able to find his whole presentation here. You can also find a less cumbersome explanation of some of what I'm rambling about at Paul's blog.

Posted by Ant at 11:48 PM | Comments (1) | TrackBack