Information Design Watch
January 21, 2012, 11:23 am
By Henry Woodbury
Doing a comparative analysis of search functionality, I came across an interesting interactive diagram at the National Archives of Australia. Using simple rollovers, the diagram explains the metadata hierarchy used within the Commonwealth Record Series (CRS) System. To see the diagram, start at the Search the Collection page, click “Search as Guest”, then click the “RecordSearch – Advanced search” tab. Here’s a screenshot:
Compare this to the boxes-and-arrows diagram used in the 4700-word CRS Manual.
What gives the interactive chart its punch is the use of verbs to describe the connections between the elements. Verbs like “contain”, “create”, and “perform” are contrasted with “are part of”, “are created by”, and “are performed by”. These words identify the relationship between subjects and objects in a much more informative way than lines with arrowheads.
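As a rough sketch of the idea (the entity and verb names below are illustrative, not taken from the actual CRS System), each connection can be stored as a subject-verb-object triple paired with its passive inverse, so both readings are generated from the same data:

```python
# Hypothetical sketch: modeling diagram connections as verb-labeled
# triples rather than bare arrows. Names are illustrative only.
RELATIONS = [
    ("agencies", "create", "series"),
    ("series", "contain", "items"),
    ("agencies", "perform", "functions"),
]

# Each verb paired with its passive inverse reading.
INVERSE = {
    "create": "are created by",
    "contain": "are part of",
    "perform": "are performed by",
}

def describe(relations):
    """Yield both readings of each verb-labeled edge."""
    for subject, verb, obj in relations:
        yield f"{subject} {verb} {obj}"           # active reading
        yield f"{obj} {INVERSE[verb]} {subject}"  # passive reading

for sentence in describe(RELATIONS):
    print(sentence)
```

One dataset thus yields both halves of the contrast the diagram draws, without any arrowheads at all.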
September 14, 2011, 3:23 pm
By Henry Woodbury
There are a lot of bills in Congress. IBM Research Labs has created a new way to find them.
IBM Many Bills is a search engine that presents U.S. Congressional legislation in a strongly visual format. Each bill is presented in a single vertical column with metadata at the top and sections in descending order. Sections are color coded to delineate their subject. You can show and hide sections of the bills you have found by subject (in a nice accountability feature, a rollover tells you how confident a subject assignment is), save specific bills, and view the actual text.
The color-coded sections allow you to view results in “minified” form, or as an extremely condensed “collection”, such as this group of American Housing Bills:
Many Bills is compelling on several levels. First is the hope that this kind of presentation can help make the legislative process more transparent to both experts and the general public. Second is the project as a model for content-specific search. By understanding the structure of the data, the Many Bills Team presents it in a way that facilitates findability and understanding. There is some risk that the team’s information architecture and design decisions could reinforce conventional thinking at the expense of the unexpected insight, but the source data is available to anyone who wants to try a different approach.
May 23, 2011, 2:02 pm
By Henry Woodbury
The Baseball Hall of Fame’s Uniform Database offers an elegant showcase of the power of small multiples. Here is a simple example:
The database output, by year or team, shows the remarkable variety in baseball uniform design within the simple confines of cap, jersey, pants, and socks. The outline style shown above was created by Marc Okkonen for his book Baseball Uniforms of the 20th Century, which concludes with 1994. Post-1994, slightly more naturalistic — and uglier — images are provided by Major League Baseball Properties.
Sadly, where this online exhibit succeeds as information design, it fails as information architecture. The search engine is very clumsy. One cannot compare specific teams or specific years. For example, earlier this season the Boston Red Sox and Chicago Cubs played in throwback 1918 uniforms. There is no way to compare Red Sox / Cubs / 1918 / 2011. For larger searches, one cannot show more than three images in a row, or more than eighteen in a page. Please, BBHOF, publish an API.
April 26, 2011, 9:23 am
By Henry Woodbury
Late on Friday afternoon last week we relaunched DynamicDiagrams.com and this blog. The new site is more scalable than the old and incorporates more ways to present our work. Information Design Watch is incorporated into the main navigation of the site though it still resolves to its own dd.DynamicDiagrams.com subdomain. We like the new look too.
Let us know what you think.
March 9, 2011, 11:44 am
By Lisa Agustin
The Guggenheim Museum recently launched an interactive timeline to accompany its new exhibition, The Great Upheaval: Modern Art from the Guggenheim Collection, 1910-1918. This colorful interactive map and timeline highlight the era’s artists, artist groups, exhibitions, performing arts, publications, artworks, historic events, and cultural movements. Select one of these categories, then scroll across to choose a particular year. Corresponding dots appear on the map above, and clicking on a dot displays a lightbox overlay with more information (see detail above). Overall, the timeline works from a linear, drill-down perspective: choose a cultural activity, a year, and a sample activity within that year. Navigating the “Selected Artworks” category gives users the most detail (as expected), with an image of the artwork, and links to the artist’s biography and to an essay about the artwork, both housed in the pre-existing online collection on guggenheim.org, a nice way to leverage and highlight what’s already available. Discovering these individual nuggets is a little like going on a treasure hunt. The user seeks and finds individual gems scattered throughout.
At the same time, though, this interactive is weak in terms of providing an integrated picture of the era overall. Part of what makes studying an artistic era so exciting is the chance to discover connections: between artistic disciplines, or between the arts and historic events. The timeline misses this opportunity by forcing users to choose only a single category (the checkbox-like bullet next to each category is misleading). Additionally, once you’ve selected a dot on the map, dots of other colors at the bottom of the lightbox (see above) are indicators of simultaneous activities, but these are only visual cues and not links. Investigating these further means selecting a different category for that year and clicking through individual dots to eventually make the connection yourself. Allowing for multiple category selection and including crosslinks to other categories at the lightbox level are straightforward ways to make the pieces of the timeline more tightly integrated, showing that the whole is greater than the sum of its parts.
The Great Upheaval is on display through June 1, 2011.
March 3, 2011, 1:27 pm
By Lisa Agustin
So you’ve just relaunched your redesigned web site or web application. You’ve addressed known user experience problems, met business requirements, and made sure the architecture is one that will accommodate future features, both known and unknown. Now here’s the tricky question: How will you know you’ve improved your user experience?
The broader question of how to measure success is one that we raise with our own clients at the beginning of every project, as this helps us figure out the organization’s priorities and focus. Definitions of success range from trackable statistics (“more users will see the catalog”) to anecdotal assessment (“employees will complain less about using it”).
There is no one-size-fits-all approach to measuring success. Moreover, with the exception of online survey tools like Zoomerang or SurveyMonkey, which can be used to assess usability and satisfaction, most tools today are designed to measure success from a business or technical staff’s perspective, rather than the users’. Google’s researchers recognized this problem in assessing their own applications and developed the HEART metrics framework, a method of measuring user experience on a large scale.
The HEART framework is meant to complement what Google calls the “PULSE metrics” framework, where PULSE stands for: Page views, Uptime, Latency, Seven-day active users (i.e., number of unique users who used the product at least once in the last week), and Earnings — clearly all stakeholder and/or IT concerns. While these statistics are somewhat related to the user’s experience (which pages get looked at, which items get purchased), these can be problematic in evaluating user interface changes:
[PULSE metrics] may have ambiguous interpretation–for example, a rise in page views for a particular feature may occur because the feature is genuinely popular, or because a confusing interface leads users to get lost in it, clicking around to figure out how to escape. A count of unique users over a given time period, such as seven-day active users, is commonly used as a metric of user experience. It measures overall volume of the user base, but gives no insight into the users’ level of commitment to a product, such as how frequently each of them visited during the seven days.
The HEART metrics framework offers a way to more precisely measure both user attitude and behavior, while providing actionable data for making changes to a product’s user interface. It comprises the following metrics, which I’ve described very briefly here:
- Happiness. This metric is concerned with measuring the user’s attitude toward the product, including satisfaction, visual appeal, and the likelihood that the user will recommend the product to others. A detailed survey, administered first as a benchmark and again as changes are implemented, covers this.
- Engagement. This measures a user’s level of involvement, which will depend on the nature of the product. For example, involvement for a web site may be as simple as visiting it, while involvement for a photo-sharing web application might be the number of photos uploaded within a given period. From a metrics standpoint, involvement can be assessed by looking at frequency of visits or depth of interaction.
- Adoption and Retention. These metrics explore behavior of unique users more in detail, going a step beyond the seven-day active users metric. Adoption metrics track new users starting within a given period (e.g., number of new accounts opened this month), while retention looks at how many of the unique users from the initial period are using the product at a later period.
- Task Success. Successful completion of key tasks is a well-known behavioral metric that relates to efficiency (time to complete a task) and effectiveness (percent of tasks completed). This is commonly tracked on a small scale through one-on-one usability tests, but can be expanded to web applications by seeing how closely users follow an optimal path to completion (assuming one exists), or by using A/B split or multivariate testing.
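As a rough sketch of how the adoption and retention metrics above might be computed (the event-log format, dates, and user names here are invented for illustration, not drawn from Google’s paper):

```python
# Illustrative sketch: adoption and retention from a hypothetical
# visit log of (user_id, visit_date) pairs. Data is invented.
from datetime import date

events = [
    ("alice", date(2011, 1, 5)),
    ("bob", date(2011, 1, 20)),
    ("alice", date(2011, 2, 3)),
    ("carol", date(2011, 2, 10)),
]

def users_in(events, start, end):
    """Unique users active between start and end, inclusive."""
    return {user for user, day in events if start <= day <= end}

jan = users_in(events, date(2011, 1, 1), date(2011, 1, 31))
feb = users_in(events, date(2011, 2, 1), date(2011, 2, 28))

adoption = feb - jan                   # users new in February
retention = len(jan & feb) / len(jan)  # fraction of January users who returned

print(sorted(adoption), retention)
```

Happiness and engagement would come from surveys and richer interaction logs rather than a bare visit log like this one.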
But these metrics are not helpful on their own. They must be developed in the context of the Goals of the product or feature, and related Signals that will indicate when the goal has been met. The authors admit that this is perhaps the hardest part of defining success, since different stakeholders may disagree about project goals, requiring a consensus-building exercise.
From my perspective, there is also the additional challenge of clients having both the forethought and resources available to track these metrics in the first place. In many cases, measuring success requires a benchmark or baseline for comparison. Without this in place, the new design itself must serve as a benchmark for any future changes.
February 16, 2011, 2:33 pm
University of Southern Maine Undertakes Re-Design with New Information Architecture by Dynamic Diagrams
By Lisa Agustin
How do you organize a collection of more than one hundred decentrally managed micro-sites into a single, cohesive entity that offers a consistent user experience from the home page down to the lowest level? This was the key issue facing the University of Southern Maine‘s site redesign, and Dynamic Diagrams was happy to help. The university had plans to migrate the site to a new content management system, and recognized the importance of creating a new architecture to provide both a better experience for site visitors as well as a standardized approach to organizing content for micro-site owners.
After completing a rigorous research and analysis phase that included stakeholder interviews, an inventory of over 5,000 pages (you may have seen the earlier Post-It Note output here), user focus groups, and an online survey, we created a new information architecture (see above) and a set of core wireframes (page schematics) to illustrate the new high-level and page-level user experience, respectively. The new architecture puts the user’s needs front-and-center by presenting all related information together (e.g., degree information that was previously scattered across the course catalog, academic department, and university system database), rather than forcing users to navigate multiple silos of information. The architecture and wireframes will guide the development of the site’s new look and feel, which is now in progress. Look for the new design to be launched later this year.
February 10, 2011, 9:55 am
By Lisa Agustin
Dynamic Diagrams is pleased to announce that the web site for the law firm of Cameron & Mittleman LLP is now live. The two main goals for this project were a refresh of the site’s design, and an easy way to maintain the web site in-house. We provided the information architecture, visual design, and web development services, which included a move to the WordPress platform. Content for launch includes the history of the firm, staff profiles, and practice area information. The extensible solution will enable the organization to add features planned for the future, including a blog. You can view the web site at http://www.cm-law.com/.
February 2, 2011, 1:25 pm
By Henry Woodbury
I’ve been reading Jacques Barzun’s magisterial history of western culture, From Dawn to Decadence. His final chapter on the late 20th century is titled “Demotic Life and Times,” “demotic” being a word that means “of the people” even if it happens to sound like “demonic.” Of the internet, Barzun writes:
That a user had “the whole world of knowledge at his disposal” was one of those absurdities like the belief that ultimately computers would think–it will be time to say so when a computer makes an ironic answer. “The whole world of knowledge” could be at one’s disposal only if one already knew a great deal and wanted further information to turn into knowledge after gauging its value.
Information isn’t knowledge. This fact points to a certain friction in the terms we use in our practice. Most often, an information architect really is concerned with information. The goal is to help individuals locate information in a context that helps them gauge its value. An information designer, however, is more focused on knowledge. The designer seeks to communicate ideas within a dataset. I wouldn’t advocate a change in terms. Knowledge designer sounds hopelessly pretentious. But the distinction between the two practices is important.
October 27, 2010, 6:57 am
By Tim Roy
Poetry is emotion put into measure. The emotion must come by nature, but the measure can be acquired by art.
- Thomas Hardy
It has been a week since an update on the redesign of the Dynamic Diagrams website, but work has been progressing steadily behind the scenes. Kirsten, the lead information architect on the project, has worked with the team to develop a solid set of functional and business requirements which have gone through several reviews. With requirements now final, some slight changes have been made to the information architecture of the site itself, although the emphasis remains on the overall design.
Wireframes are also complete and so the work has been turned over to Matt, who is the design lead. It is not an enviable job, designing for a group who spend their days fully focused on all things visual. Matt’s first decision was to use “web fonts”, an emerging standard that allows us to employ our company standard, Meta, without having to use (or maintain) image files. This provides us with a tremendous degree of flexibility while still allowing us to create a consistent look and feel for Dynamic Diagrams.
Matt has produced a first draft of a design style and has received feedback from the working group. This will result in a second version that will be presented to the entire Dynamic Diagrams staff sometime next week. Despite the tough audience he will be facing, Matt can be assured that we will provide him with useful feedback (as opposed to the “I’ll know it when I see it” or “looks good, but can you make it blue?” nightmares that haunt all design professionals). As reported earlier, the biggest change will be in providing a far wider and deeper range of work from our portfolio and Matt and Kirsten seem to have that well in hand.
October 4, 2010, 7:31 pm
By Tim Roy
Dynamic Diagrams has been privileged to collaborate with some of the finest museums in the world including the J. Paul Getty Museum, the National Air and Space Museum, and the United States Holocaust Memorial Museum. While our work has ranged from designing the overall information architecture of a museum’s web presence, to multi-media personal histories, to complex interactive kiosks involving 3D models, it is connected by the unifying thread of our focus on user experience. By considering how a visitor will experience an interaction – be it a web site, kiosk, or video – we can help our clients facilitate the most challenging of communication goals: understanding.
There is little doubt that the “big” museums – the Gettys, the Tates, and the MOMAs – garner a great deal of public attention for their collections and the experiences they create. Yet, there is something special about the “small” museums and what they can teach us.
Sir John Soane’s Museum is one such example. Located in London, it was established in 1806 by the architect Sir John Soane in the interest of providing design and artistic resources for his architectural students. By 1833, the collection had been made public under an act of Parliament, and upon Soane’s death in 1837 it was placed under the auspices of a board of trustees and a curator, with the sole intent of making the house and its holdings broadly accessible.
Housing almost 35,000 unique items ranging from Egyptian antiquities to medieval objects to architectural models, Soane assembled his own secret world designed to inspire “Amateurs and Students of the Arts.” In his attention to the smallest and most subtle detail, Soane created meaning for those who cared enough to carefully observe and engage. Stories could be found in a letter’s postmark or in the placement of a single carved button. In many ways, this is an early gesture towards producing an experience for a collection’s users informed by a shared language and common goals.
The museum’s web site was recently redesigned and provides an interesting overview of the collection and some of its hidden details. Still, there is no replacement for actually experiencing the museum in person, even if one must patiently queue for admission. The wait is absolutely worth it.
October 1, 2010, 2:57 pm
By Lisa Agustin
According to a recent article in strategy + business, creating a better shopping experience is really about offering a better choosing experience. More specifically, fewer choices. Offering people lots of options — 31 flavors! (Baskin-Robbins) 87,000 drink combinations! (Starbucks) 27 million books! (Amazon) — sounds like a great idea. But too many choices can have the opposite effect, leading to confusion, anxiety about the “right” choice, and ultimately a poor choice or even No Sale. Why? In a nutshell, it comes down to neurological limits on our ability to process information — while the idea of lots of choices sounds exciting (there’s one made “just for me”!), it can be paralyzing to choose from too many options.
The article references studies performed in a grocery store (choosing a jar of jam) and a workplace (choosing a retirement savings plan) to illustrate its point, but it struck me that this applies to online experiences, too. A web site with a lot of content that offers too many options that are poorly organized will lead to frustrated users who will abandon your site for your competitor’s.
So how do we — marketers or user experience (UX) practitioners — craft a better choosing experience? Authors Sheena Iyengar and Kanika Agrawal offer the following tips, to which I’ve added my UX take:
- Cut the number of options.
- Create confidence with expert or personalized recommendations.
- Categorize your offerings so that consumers better understand their options.
- Condition consumers by gradually introducing them to more-complex choices.
Don’t worry about losing shelf space to competitors–in the end, trimming back the product line lowers costs, increases sales, and makes it easier for consumers to choose. According to the authors, “In case the poor performers aren’t evident from sales figures, focus groups and online networks can help you separate the wheat from the chaff.” My UX take: Focus on presenting the content and tasks that mean the most to your users. A combination of web analytics and user feedback (interviews or surveys) will help you figure out what should be on the site and what’s expendable.
Expert reviews and recommendations “let consumers skip over much of the information-processing component of choosing, minimizing cognitive stress and enabling them to make good choices,” according to Iyengar and Agrawal. My UX take: This tip speaks directly to the online experience. Many web sites offering highly differentiated items (books, music, clothes) benefit from recommendation tools, or automated systems (“electronic agents”) that generate suggestions based on consumers’ expressed preferences. While these tools require more of an investment on the part of the organization and sometimes the user (e.g., if a survey or profile needs to be completed), they can be worth it if your web site is one that offers a large quantity of content or inventory to peruse.
“For an expert, there is no completely unique product or service; rather, each offering is a distinctive combination of attributes that the expert has seen before.” The key is getting a novice to act like an expert by creating top-level categories that are easily understood. As an example, the authors cite wine retailer Best Cellars, which limits its varieties to 100 wines that are divided into eight top-level categories, such as “fizzy,” “juicy,” and “sweet.” Once the novice has chosen a category, he or she can choose a wine within that category by reading the detailed labels that accompany all the bottles. My UX take: For web site users who rely on browsing to find what they want, category names are critical. This means avoiding terminology that is either organization-centric (“Initiatives”) or vague (“Solutions”) and using what makes the most sense to users.
“For certain kinds of decisions, you can set up consumers for success by encouraging them to learn from, and build upon, their own previous choices.” Iyengar cites a study in which two groups of car customers were asked to customize their vehicles, choosing everything from the engine to the rearview mirror. The first group started by choosing features with a high number of options, moving to those with low numbers of options. The second group started by making choices for features with a low number of options first. In the end, the first group had a less satisfying experience: “They began by carefully considering every option, but they soon grew tired and settled for the default. In the end, they wound up less satisfied with their cars than the buyers who had progressed from low choice to high choice.” My UX take: Users can go through a lot of information online, provided it’s presented to them in a way that lets them process it in logical bite-sized pieces. This means creating an information architecture that uses categories that make sense to the intended audiences, a hierarchical structure that lets users drill down and expose more information as they need it, and a supporting design that visually prioritizes information on each page.
Iyengar and Agrawal acknowledge the dilemma: “Don’t marketers have to give consumers what they want? Yes and no. We should give them what they really want, not what they say they want…They want to feel confident of their preferences and competent during the choosing process; they want to trust and enjoy their choices, not question them.” The online experience should work the same way.
August 10, 2010, 11:54 am
By Henry Woodbury
How is the Internet Changing the Way You Think?
Is it? That’s up to you. Editor and Publisher John Brockman anticipates the point:
We spent a lot of time going back and forth on “YOU” vs. “WE” and came to the conclusion to go with “YOU”, the reason being that Edge is a conversation. “WE” responses tend to come across like expert papers, public pronouncements, or talks delivered from stage.
In the North Pacific ocean, there were two approaches to boatbuilding. The Aleuts (and their kayak-building relatives) lived on barren, treeless islands and built their vessels by piecing together skeletal frameworks from fragments of beach-combed wood. The Tlingit (and their dugout canoe-building relatives) built their vessels by selecting entire trees out of the rainforest and removing wood until there was nothing left but a canoe.
The Aleut and the Tlingit achieved similar results — maximum boat / minimum material — by opposite means. The flood of information unleashed by the Internet has produced a similar cultural split. We used to be kayak builders, collecting all available fragments of information to assemble the framework that kept us afloat. Now, we have to learn to become dugout-canoe builders, discarding unnecessary information to reveal the shape of knowledge hidden within.
Give us a tree and we’ll carve your canoe. That is what Tim Roy is talking about.
(via Andrew Gilmartin, who linked to Dyson’s quote on Facebook. Andrew blogs here.)
Update: I rewrote my lede, up to the Dyson quote, to add context and incorporate Brockman’s “you” vs. “we” statement.
July 29, 2010, 12:26 pm
By Kirsten Robinson
The Portsmouth Herald has published an article about Historic New England’s new web site and online collections project, for which Dynamic Diagrams provided web strategy, information architecture and design services, as well as project management for the site’s development.
You can view the web site at www.historicnewengland.org or dive right into searching and browsing the online collections — full of photos, artifacts, and reference materials having to do with 400 years of New England History.
We’re currently in the final stage of the project, conducting usability tests on the new site.
May 27, 2010, 2:15 pm
By Kirsten Robinson
Historic New England’s redesigned web site is now live at www.historicnewengland.org. Historic New England is a non-profit organization dedicated to preserving and presenting New England’s history. They own and operate 36 historic house museums, provide educational programming for adults and children, collect and conserve historic objects and archives, help preservation organizations and homeowners protect and maintain historic sites, and publish books and magazines about history and preservation.
Some highlights of the new site:
- Improved navigation and fresh visual design replaced a site that had grown organically over ten years.
- Greatly expanded content on historic properties, preservation, and more: site updates are completely under the control of Historic New England staff for the first time, through an easy-to-use content management system (CMS) called Plone.
- Online collections access: users can now browse and search Historic New England’s extensive collections of museum objects, archival materials, and books. Online exhibitions are also easier to create.
- Interactive events calendar allows users to browse events by date and location and then click through to the online shop for registration.
- Search engine provides quick access to site content and collection highlights from any page, and there are also specialized searches for collections and events.
- Galleries and slide shows are available throughout the site to better present Historic New England’s great photography. Here’s one about the animals at Spencer-Peirce-Little Farm.
- Multimedia is also supported, as seen in the Berlin & Coos County oral history project.
- Interactive map provides a visual overview of Historic New England’s 36 property locations.
- Integration with Historic New England’s online shop (developed by a third party) enables them to sell memberships, event registrations, and merchandise, and to accept donations. The shop integration will also enable single sign-on between the site and the shop, allowing access to restricted content as well as member discounts on purchases.
- News has categories and feeds to position news appropriately throughout the site, and allows user commenting.
- Microsites enable visitors to rent properties for weddings and functions and to celebrate Historic New England’s centennial.
Dynamic Diagrams has been working with Historic New England since January 2009 to define web strategy, information architecture, user experience, and visual design for the site. We worked with our development partners to implement the site using the Plone CMS, to convert legacy content, and to integrate the site visually and functionally with Historic New England’s online shop. We collaborated with our partners and Historic New England’s collections team to define and develop the Collections Access portal. Finally, we and our partners trained Historic New England staff authors on Plone and writing for the web, so that they could develop new content for the site and maintain it going forward.
We are thrilled to see the site go live and congratulate Historic New England on a successful launch.
February 5, 2010, 2:07 pm
By Lisa Agustin
Rattle offers its first blog post on developing the user experience strategy for “A History of the World,” the companion web site for the BBC Radio 4 series of the same name. Written and narrated by Neil MacGregor, Director of the British Museum, the radio program travels through two million years to tell the history of humanity through 100 handmade objects from the Museum, ranging from a stone chopping tool to the cell phone. The web site enables exploration of these objects in detail, but also gives users the opportunity to participate by commenting on the collection or uploading images from their own personal collections. Rattle’s initial post walks us through general principles from their brief (e.g., “some use of participatory media”), the resulting strategic goals (e.g., “focus on attracting, rewarding, and promoting a small minority of contributors”), and its initial brainstorm of features (e.g., “select 10 objects to represent the History of Me”). Now that the site is live, it will be interesting to read future installments to see how these initial high-level goals and blue-sky thinking compare to what was actually developed.
January 12, 2010, 11:13 am
By Kirsten Robinson
Historic New England has launched a Centennial microsite to celebrate their 100th year of preserving New England’s history and to highlight centennial projects that they are creating in conjunction with community partners throughout the New England states. Key site features include an events calendar, photo galleries and slide shows, and video oral histories.
Historic New England selected Dynamic Diagrams to create the user experience for the site (research, information architecture, visual design, and XHTML and CSS coding). We worked with our development partners to implement a Plone content management system (CMS) that provides Historic New England — for the first time — with complete control to create their own pages.
The Centennial site is also a preview of things to come. Watch this space for a future announcement of Historic New England’s redesigned and enhanced main web site.
October 9, 2008, 11:13 am
By Lisa Agustin
The World Health Organization’s Special Programme for Research and Training in Tropical Diseases (TDR) unveiled its new corporate web site this week.
Although the existing site had much to offer, users had difficulty finding the information they needed (namely grant opportunities and TDR research publications), and the client felt that TDR’s contributions were buried. The redesigned site features a new information architecture that makes key content easier to find, while highlighting TDR’s accomplishments and new business strategy.
September 2, 2008, 12:20 pm
By Lisa Agustin
I’ve been following with some interest UIE’s series on what it considers web design “cop-outs,” such as site maps. According to Jared Spool, a good information architecture should eliminate the need for a site map, since the map itself doesn’t “give off scent,” or clues for finding desired content:
“It’s only in the absence of anything else that gives off scent that users start to think it’s a likely help. Therefore, the real problem is the pages that lead to the site map are missing important scent. Fixing the scent issues on those pages will eliminate the need for the site map. However, deciding to improve the site map doesn’t fix the scent problem — it’s only a cop-out.”
I do agree that redesigning a site map is not the way to address findability issues, but it’s a drastic move to get rid of the site map altogether, even if only a few people use it. We look at the site map not as a back-up option for locating content, but rather as the single-page view of what the whole web site offers. When done well (ideally as a single static page of links that doesn’t go deeper than 2-3 levels in the hierarchy), the site map is not a crutch, but a complementary navigation tool.
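A site map of the kind described here — one static page of links, capped at two or three levels — amounts to a simple bounded traversal of the site hierarchy. Here is a minimal illustrative sketch; the `SITE` structure and `render_site_map` function are invented for this example, not tied to any particular CMS:

```python
# Hypothetical site hierarchy: each node is (title, list_of_children).
SITE = ("Home", [
    ("Collections", [
        ("Photographs", []),
        ("Manuscripts", []),
    ]),
    ("About", [
        ("Staff", []),
        ("History", []),
    ]),
])

def render_site_map(node, depth=0, max_depth=3):
    """Render a nested text site map, stopping after max_depth levels
    so the page stays a quick single-page overview, not a full dump."""
    title, children = node
    if depth >= max_depth:
        return []
    lines = ["  " * depth + "- " + title]
    for child in children:
        lines.extend(render_site_map(child, depth + 1, max_depth))
    return lines

print("\n".join(render_site_map(SITE)))
```

The `max_depth` cutoff is the point of the exercise: everything on the site is reachable, but the map itself never tries to mirror the entire hierarchy.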
October 4, 2007, 12:31 pm
By Henry Woodbury
Back in 2002, we designed a Modeling Access Control Poster for that year’s ASIS&T Information Architecture Summit. We intentionally challenged ourselves to explain web-based access control systems on a conceptual level, rather than show a particular case.
This approach now helps us, internally, to define the appropriate requirements-gathering baseline for a newly conceived system.
The printable poster is here: Modeling Access Control Poster (PDF, 287K).
July 20, 2007, 10:37 am
By Lisa Agustin
Rich Internet Applications (RIAs) enable a user experience that’s more responsive and sophisticated than traditional HTML. But does crafting the RIA experience differ that much from architecting a traditional web site? Yes and no, says Adam Polansky in the latest ASIS&T Bulletin. Polansky, an information architect for an online travel company, was tasked with producing a trip planning application that had originally taken shape as an exciting proof-of-concept Flash demo, but which had not been scrutinized in terms of scalability, usability, or actual user needs.
Before moving forward, Polansky took a few steps back by employing traditional IA exercises such as wireframing (adapted to a more interactive experience) and usability testing to validate the direction and identify the holes. Besides pointing out the similarities and differences between building web sites and RIAs, he offers a good shortlist of pitfalls to avoid, including the potential for increased revision cycles and building interaction at the expense of content. I would tend to agree with him on both fronts. In our practice, we’ve found that process flows and annotated wireframes are key to keeping everyone on the same page about the intended user experience and the possible trade-offs between vision and feasibility. These activities ease (if not eliminate) any worry of creating interaction for its own sake.
June 19, 2007, 10:56 am
By Lisa Agustin
Earlier this month, Fastcompany.com plugged the agile development approach that was used to redesign its home page. The approach in a nutshell, according to blogger Ed Sussman: “Vision, release, test, iterate. Repeat. Quickly.” Speaking metaphorically, think of design and development as a washing machine, not a waterfall. The organization initially planned to release the new design as part of a larger effort that encompassed new features and functionality. But in the end, they decided against it:
What if we had waited to get it all just right before we released FC Expert Bloggers? We’d still be in the dugout. We’d have been guessing instead of seeing what the market actually thinks. In an effort to make our product perfect, we probably would have been forced to spend loads of money fixing problems that might not have mattered to our readers.
The agile approach is one that certainly has its benefits — it’s flexible and means users get to see the latest features sooner, without waiting for an annual update. But in order to be successful, an agile approach still has to start with stakeholder and user requirements that are validated through an information architecture, design, and development process. Only then can an organization be sure its site’s “killer widgets” are truly meeting the needs of its audience.
May 10, 2007, 8:32 pm
By Mac McBurney
Our collaboration with the University of St Andrews was an important reminder about organizations and their information: Good information architecture is necessary, but it is not sufficient. Copious, heterogeneous, complex information tends to come from organizations of similar description, so improving the web site — especially the public web site — means getting intimate with the culture and politics of the organization.
Luckily, our colleagues at St Andrews understood their new information architecture as a process, not an event. They involved people from across the university, took the time to understand the reasons behind our recommendations, and called on us to help educate stakeholders about our plans. The project was part town meeting, part information architecture crash course, not to mention figuring out where to put all those web pages.
Structurally, the relaunched web site is a radical departure from the old. (The 404 error page gives a hint.) Previously, the university’s many administrative offices had each looked after their own presence on the web, and the site became — for good and understandable reasons — a daunting, overgrown web-site-as-org-chart. The new information architecture makes two important changes. First, the site represents the character and vitality of the institution as a whole, not just the individual parts. Second, no prior knowledge of the university’s bureaucracy is required. Content is organized according to its audience, not its author. The home page and its subsidiaries are tailored for outside audiences and infrequent visitors. Alternate home pages, a completely separate system of categories, and different navigation and interface designs are provided for current students and staff.
To see photos of a sunny day in Scotland and read about our presentation last June (and other tales) from someone on the client’s side, check out Gareth Saunders’ personal blog, View from the Potting Shed.
August 3, 2006, 10:23 am
By Henry Woodbury
Kevin Hale at Particletree has a pair of articles on prototyping and wireframing AJAX applications. These are excellent primers for web designers interested in working with AJAX developers. I learned some CSS details that will help me out. However, I think Hale misses one key point of prototyping and wireframing almost entirely.
That point is risk management. When a design is still under review, when process steps are still being determined, box wireframes and bitmap design comps allow an information architect and designer to develop ideas quickly and revise or even abandon them with minimal pain. At Dynamic Diagrams we prefer to avoid coding designs until we have agreement on all the major elements of the interface. If we can do at least some usability testing with bitmaps, that’s even better.
But what of the demands of the interactive AJAX-driven interfaces that Hale describes? One risk-free way to show AJAX interactivity is to present work you’ve already done, perhaps from your own development library. Once stakeholders agree about the types of interactivity they want, an actual interface can almost always be modeled as a sequence of static wireframes, convertible to static bitmaps: Here are the elements at point x; Here they are at point y. More useful than a working prototype at this stage may be a workflow diagram that shows an entire sequence of steps in one view:
This diagram shows the interaction of different user types with a help ticket, describes when that object changes status (open, under review, closed), and identifies which pages in the process have multiple functions — crucial information for an AJAX developer to understand.
All this said, there (usually) comes a time when a project advances beyond wireframes and design comps to coding and development. At this point, for the web designers working with AJAX developers, Hale’s advice makes a lot of sense.
July 21, 2006, 4:04 pm
By Henry Woodbury
It’s like the early days of Web design, but more so. This Design Interact article describes how Yahoo planned and delivered its mobile device site for the 2006 World Cup. The goal was to make a site that could work on as many browser-enabled phones as possible. The problem was the baffling idiosyncrasies of those devices:
“The Web browsers on phones vary from basic to super basic,” explains Keith Saft, senior interaction designer at Yahoo! Mobile. “They also have these eccentric bits of HTML and CSS that they don’t support, and there aren’t really any standards or consistency across phones.”
As they catalogued the technical limitations of mobile browsers, the Yahoo team created a design strategy that prioritized usability:
With production also came usability testing. And here, surprisingly enough, the team did not try to achieve perfect layout and content consistency on every phone. Instead, it wanted to make sure that users understood something it called “design intent.”
Do users navigate efficiently through the site? Do they understand how items are grouped on a screen? Can they retrieve the information they want? “Design intent” is design by information architecture.
June 19, 2006, 1:01 pm
By Lisa Agustin
Consumers? Customers? Users? These are the words that we’ve grown accustomed to using when referring to the person who will benefit from the latest object or information we’ve designed. According to author Don Norman, these are derogatory terms that continue to look at the (pardon us) end user from the company-centered perspective. Says Norman, why not look at them for what they really are — People:
If we are designing for people, why not call them that: people, a person, or perhaps humans. But no, we distance ourselves from the people for whom we design by giving them descriptive and somewhat degrading names, such as customer, consumer, or user. Customer — you know, someone who pays the bills. Consumer — one who consumes. User, or even worse, end user — the person who pushes the buttons, clicks the mouse, and keeps getting confused.
The related terminology is no less impersonal, since these users — in their various “roles” — “perform tasks” to get “results” and hopefully avoid “errors” in the process. Norman’s suggestion to “wipe out words such as consumer, customer, and user from our vocabulary” may not be possible; rather, the takeaway for designers should be to strive for an understanding of and empathy for those who will hopefully benefit from well-designed information or products.
April 14, 2006, 9:13 am
We have expanded the page to take advantage of the larger monitors now used by the vast majority of our readers. We’ve improved the navigation throughout the site so that no matter what page you land on, you can easily dig deeper into other sections or use our multimedia.
There are a few off-key notes. Articles in the “Most Popular: Blogged” list don’t have trackbacks to actual blogs. Times Select still requires pay-per-view for most opinion columns. But the Times’ biggest worry should be too many people agreeing with media critic Jack Shafer:
“Hello, New York Times? I’d like to cancel my subscription today….I’m canceling because the redesign of your Web site, which you unveiled yesterday, bests the print edition by such a margin I’ve decided to pocket the annual $621.40 I currently spend on home delivery.”
March 17, 2006, 9:22 am
The Design Council, a UK organization that advocates good design, provides a lot of cross-disciplinary information on its sprawling Web site. If you need to reacquaint yourself with the meaning of information design or explain it to colleagues, take a look at Sue Walker and Mark Barratt’s About: Information Design article. For example, here the two explain the relationship between information design and information architecture:
“Information designers order and structure information on behalf of users. The information architect — a new role created by large websites and information systems — is part information designer, part information scientist, part information systems professional. They create the order, taxonomies and navigation interfaces that allow us to use today’s million-page websites efficiently.”
With sections like “Why it matters to business,” “Why it matters to public,” “Examples,” “Facts and Quotes,” and so forth, the article offers a number of entry points for different audiences.
Our own white paper Information Architecture for Web Sites can be found here:
February 10, 2006, 9:58 am
On A List Apart, designer Derek Powazek offers some guidelines for home page design. Along with good, concrete suggestions on home page form and function, Powazek leads off with an information architecture gem — create the home page last:
“When I set out to design a website, I do it backwards. I start with the design of the smallest, deepest element: the story page or search results. Then I work backwards to design their containers: section pages, indexes. Then, lastly, I work on the home page. I do this because each container needs to adequately set expectations for what it contains. If the home page says one thing, but the internal pages say another, that’s going to lead to a user-experience failure.”
Determining the nature of the deepest element in a site is vital for creating scalable sites with consistent branding and navigation systems that help users find something once, then find it again.
December 9, 2005, 10:26 am
In an article on A List Apart, Nick Usborne comes close to defining information architecture without ever using the term. Addressing Web designers, he asks:
Now, just pause for a moment and think of all the design choices you have made over the last year, and the reasons why you made them. And think about the huge impact those choices might have had on the performance of the sites you worked on.
Usborne presents a scarily simple usability test to demonstrate his thesis. And counsels designers to act like information architects — that is, to talk to content owners and, even better, to actual users.
And Who Does Not?
Scott Jason Cohen presents the alternative view:
[T]o many, the information architect seems redundant. If the project involves heavy back-end implementation, the system and user flow will already be determined. Click here, go there – the tech people will have already figured this out. In terms of layout, a good visual designer will know not to make a page too damn cluttered. (It’s usually the client that insists on putting 3,000 links on the front page or making the logo spin.)
Cohen’s manifesto is entertaining but misses the mark in some fundamental ways. First, the “tech people” have not already figured everything out. Good back-end technologies are designed for customization; analyzing and diagramming process flows is an important part of our information architecture practice. Second, information architecture is creative. The challenge of organizing complex information for use by different types of users is never solved by prepackaged rules and good information architects know this.
Cohen characterizes information architects as outsiders who interfere with the design process. In fact, information architects and visual designers face the same challenges and work toward the same goals; the best designs come from collaborative practice.
September 19, 2005, 12:25 pm
Yahoo’s Mindset search (beta) adds a simple way to filter search results. Once you run a search, an interactive “slider” allows you to focus results more on “shopping” or “researching”. This binary choice actually turns out to be quite effective at sorting typical results. It does raise the interesting question of whether other such pairs of opposites could be as useful in particular contexts.
For a completely different use of dynamic sliders, see Amazon.com’s Diamond Search. The fact that diamonds are graded by very specific qualities results in an excellent example of a faceted search:
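Under the hood, a faceted search like the diamond example comes down to filtering one collection on several independent attributes at once, with a slider corresponding to a numeric range facet. A minimal sketch, with invented sample data and a hypothetical `faceted_search` helper:

```python
# Invented sample data: each diamond graded on specific qualities.
diamonds = [
    {"carat": 0.5, "cut": "Good",      "price": 1200},
    {"carat": 1.0, "cut": "Very Good", "price": 5400},
    {"carat": 1.2, "cut": "Ideal",     "price": 8900},
    {"carat": 2.0, "cut": "Ideal",     "price": 21000},
]

def faceted_search(items, **facets):
    """Keep items matching every facet. A (lo, hi) tuple behaves like
    a slider-style range; any other value is an exact match."""
    def matches(item):
        for key, wanted in facets.items():
            value = item[key]
            if isinstance(wanted, tuple):
                lo, hi = wanted
                if not (lo <= value <= hi):
                    return False
            elif value != wanted:
                return False
        return True
    return [item for item in items if matches(item)]

# Combine a range facet (the slider) with an exact-match facet.
results = faceted_search(diamonds, carat=(1.0, 2.0), cut="Ideal")
```

Because the facets are independent, each slider or menu narrows the same result set without the user having to compose a query — which is exactly what makes sharply graded data like diamonds such a good fit.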
July 11, 2005, 12:44 pm
According to Jakob Nielsen, Internet users have developed pretty firm ideas about what “Search” is and how it works:
“In our experience, when users see a fat ‘Search’ button, they’re likely to frantically look for ‘the box where I type my words.’ The mental model is so strong that the label ‘Search’ equals keyword searching, not other types of search.”
What this means is that alternate search methodologies (parametric searches, for example) need to be presented within a strong supporting context, including, perhaps, the complete avoidance of the word “Search.”
An earlier article by Nielsen on search usability reinforces his current findings (and is worth reading in full for other guidelines):
“Users often move fast and furiously when they’re looking for search. As we’ve seen in recent studies, they typically scan the home page looking for ‘the little box where I can type’.”
March 11, 2005, 10:03 am
Studies in the Journal of the American Medical Association report that treatment tracking software may be problematic for patients. This summary in the New York Times only vaguely distinguishes between data standards, software development, and usability, but clearly some of the reported problems relate to information architecture:
“To find a single patient’s medications, the researchers found, a doctor might have to browse through up to 20 screens of information….Among the potential causes of errors they listed were patient names’ being grouped together confusingly in tiny print, drug dosages that seem arbitrary and computer crashes.”
http://www.nytimes.com/2005/03/09/technology/09compute.html?incamp=article_popular_5 (free registration required)
The first of the JAMA studies, “Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors,” is offered for free on the association’s Web site:
February 11, 2005, 2:05 pm
Many Internet sites and services are starting to provide content tagging tools to their users. Content authors or visitors can label individual content items (articles, pages) with their own terms. In aggregate, these overlapping terms create a layer of metadata that site technologists can use to inform content links, generate better search results, and build browsable indices.
“‘To me, they’re a great new organization tool for applications and large content sites,’ said Matt Haughey, the founder of MetaFilter. ‘Tags are great because you throw caution to the wind, forget about whittling down everything into a distinct set of categories and instead let folks loose categorizing their own stuff on their own terms.’”
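The aggregation step described above — many people’s overlapping labels becoming a browsable layer of metadata — is essentially an inverted index over tags. A short sketch with invented example data:

```python
from collections import defaultdict

# Invented example: users label content items with their own terms.
taggings = [
    ("article-1", ["design", "usability"]),
    ("article-2", ["usability", "search"]),
    ("article-3", ["design", "metadata"]),
]

def build_tag_index(taggings):
    """Invert item -> tags into a browsable tag -> items index,
    the raw material for tag pages and related-content links."""
    index = defaultdict(set)
    for item, tags in taggings:
        for tag in tags:
            index[tag].add(item)
    return dict(index)

index = build_tag_index(taggings)
```

From this index a site can generate a browsable page per tag, surface co-occurring tags as “related” links, or weight search results — all without anyone whittling the vocabulary down to a fixed set of categories.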
December 8, 2004, 2:44 pm
Google’s new “Google Scholar” (http://scholar.google.com/) is a public search engine specifically targeted to scholarly information. Of interest are the implications of Google’s typically terse recommendations for submitting and accessing different kinds of content. Regarding abstracts, for example, Google requires open access:
“Regardless of the source, you should be able to see an abstract for any article, with the exception of those that are offline and referenced in citations only. Please let us know if you don’t see even an abstract.”
These are issues we’ve encountered many times in our work for university publishers and professional associations. Google’s recommendations are likely to start turning good practices into industry standards.
Google Scholar is generating a lot of interest online; here are two reports:
November 10, 2004, 2:57 pm
Information Architect Peter Boersma argues that “big” information architecture, the large-scale integration of specialized IA tasks such as navigational design and metadata analysis with related specialties such as visual design and copywriting, should inherit the term “User Experience.”
With the aid of several condensed cocktail-napkin sketches, Boersma gets beyond the terminology debate and offers a useful way of understanding the scope of big information-based projects.
Boersma introduces his essay with a reference to Peter Morville’s “Big Architect Little Architect” essay, found here:
September 16, 2004, 3:29 pm
Poynter Institute has posted the results of “Eyetrack III,” a study on how people look at news online. While the study is “wide, not deep,” it contains many interesting points that could contribute to the analysis of any content-based Web site. For example:
“Photographs, contrary to what you might expect (and contrary to findings of 1990 Poynter eyetracking research on print newspapers), aren’t typically the entry point to a homepage. Text rules on the PC screen — both in order viewed and in overall time spent looking at it.”
August 12, 2004, 3:35 pm
Edmunds.com’s user forums contain more than 2.5 million messages and 100,000 car reviews. To extract meaning from this wealth of commentary, questions, and answers, Edmunds.com is analyzing the data with Attensity Corporation’s PowerDrill software. PowerDrill is notable for its use of sentence diagramming to identify the actors, actions and objects in unstructured text:
“[In beta tests] Edmunds.com was able to analyze trend information from conversations on the forums, including shopping and dealer behavior, re-occurring issues, and concerns which can also be used to predict future behavior.”
Tools such as PowerDrill that turn free-form text into relational data may cause Web developers to take a new look at how they utilize forums, feedback forms, Web logs and free-form content spaces.
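PowerDrill itself is proprietary, but the underlying idea — pulling actor/action/object triples out of free-form sentences — can be illustrated with a deliberately naive sketch. Real sentence diagramming uses full grammatical parsing; this toy version, with an invented verb list, only handles plain subject-verb-object word order:

```python
import re

# Invented vocabulary of actions we care about (a real system would
# identify verbs grammatically, not from a fixed list).
VERBS = {"bought", "returned", "recommends", "visited"}

def extract_triples(text):
    """Naively split sentences into (actor, action, object) triples
    whenever a known verb appears between other words."""
    triples = []
    for sentence in re.split(r"[.!?]", text):
        words = sentence.split()
        for i, word in enumerate(words):
            if word.lower() in VERBS and 0 < i < len(words) - 1:
                actor = " ".join(words[:i])
                obj = " ".join(words[i + 1:])
                triples.append((actor, word.lower(), obj))
                break
    return triples

triples = extract_triples(
    "The customer returned the sedan. My dealer recommends the coupe."
)
```

Even this crude structure hints at why the approach is powerful: once forum chatter is reduced to rows of actors, actions, and objects, it can be queried and trended like any other relational data.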
June 17, 2004, 9:53 am
Web site information architecture methodologies are generally focused on usability — on how users can best find the information they seek. A related issue of significant importance to businesses and content creators is how users determine the credibility of the information they find.
A landmark study in this field is Stanford Persuasive Technology Lab’s 2002 report for ConsumerWebWatch, How Do People Evaluate a Web Site’s Credibility? Results from a Large Study. This report points out many parallels between Web usability and Web credibility, with an unexpectedly strong plug for visual design:
“the average consumer paid far more attention to the superficial aspects of a site, such as visual cues, than to its content. For example, nearly half of all consumers (or 46.1%) in the study assessed the credibility of sites based in part on the appeal of the overall visual design of a site, including layout, typography, font size and color schemes.”
What was the next most important factor? Information Design and Structure. The authors suggest:
“One might speculate that by providing a clear information structure, a Web design team demonstrates expertise to the users. Users may then assume this expertise extends to the quality of information on the site.”
February 3, 2004, 9:28 am
Presenting educational content online demands a collaborative, flexible approach. Institutions need standards that allow them to share learning tools. Individual instructors need a means to create customized content. The Open Knowledge Initiative, a proposed solution to this problem, started with a long look at fundamentals:
“For most of the project’s first year, key developers from each of the collaborating schools met at MIT to hammer out a list of the basic functions that an educational management system would need.” [Our emphasis.]
Even when specifications focus on programming, user interaction is explicit in many functions: authentication, file sharing, scheduling, messaging, etc. Software standards imply an information architecture, whether planned or not; a comprehensive information architecture helps ensure that a system is both portable and scalable.
http://www.technologyreview.com/articles/atwood1203.asp?trk=nl (free registration required)