Information Design Watch

March 14, 2012, 1:29 pm

Eulogy for a Dinosaur

By Henry Woodbury

The venerable Encyclopedia Britannica has ceased print publication:

In an acknowledgment of the realities of the digital age — and of competition from the Web site Wikipedia — Encyclopaedia Britannica will focus primarily on its online encyclopedias and educational curriculum for schools. The last print version is the 32-volume 2010 edition, which weighs 129 pounds and includes new entries on global warming and the Human Genome Project.

Given the well-known collaborative editing model of Wikipedia, Gary Marchionini, the dean of the School of Information and Library Science at the University of North Carolina at Chapel Hill, makes an interesting claim not only about the competitive difference, but about the nature of knowledge itself:

The thing that you get from an encyclopedia is one of the best scholars in the world writing a description of that phenomenon or that object, but you’re still getting just one point of view. Anything worth discussing in life is worth getting more than one point of view.

Comments (4) | Filed under: Business, Scholarly Publishing, Technology

March 7, 2012, 3:42 pm

The Scientists Sketch

By Henry Woodbury

Data visualization consultant Lee De Cola has assembled a neat cross section of sketches by famous scientists. Here, for example, is a literal back-of-the-envelope sketch by Henri Poincaré:

Henri Poincaré's back-of-the-envelope calculations

Sadly, many of the images are small or stripped of context. Consider them a teaser. Galileo’s sketch of Saturn is a minor doodle compared to the visual storytelling found in this page from his notebook on Jupiter:

Moons of Jupiter, from Galileo's Notebook

Comments (2) | Filed under: Art, Charts and Graphs, Diagrams, Illustration, Information Design, Maps, Scholarly Publishing, Visual Explanation

January 13, 2012, 12:59 pm

The Cost of Research

By Henry Woodbury

As the rumble between intellectual property and free speech advances into the ring drawn by SOPA (the Stop Online Piracy Act), Michael B. Eisen draws attention to a fight on the undercard. Eisen, a professor of molecular and cell biology, critiques the Research Works Act, which, in his words:

…would forbid the N.I.H. [National Institutes of Health] to require, as it now does, that its grantees provide copies of the papers they publish in peer-reviewed journals to the library. If the bill passes, to read the results of federally funded research, most Americans would have to buy access to individual articles at a cost of $15 or $30 apiece. In other words, taxpayers who already paid for the research would have to pay again to read the results.

Supporters of the bill include many traditional publishers of medical research (ironically, one of its sponsors, Darrell Issa, Republican of California, is one of SOPA’s most prominent opponents).

Dynamic Diagrams has worked with scientific publishers for more than 15 years. We worked with major journals like Nature and JAMA to bring them fully online; we’ve also worked with research aggregators such as HighWire and Publishing Technology. We’re well aware of the technology and information management demands of online presentation alone, to say nothing of the physical and specialist costs of creating a print publication. Now consider the editorial investment required to guide content to a publishable state (even if, as Eisen points out, peer review is provided voluntarily, often by researchers at publicly funded institutions). At a tactical level, for example, most journals require an access-controlled transactional web space where authors and editors exchange drafts.

This is not to take sides in the argument, but to draw attention to the real costs associated with managing and presenting electronic information. These should not be disregarded. At Scientific American, the comments section of Michelle Clement’s call to oppose the bill offers some back-and-forth (hopefully Clement won’t follow through on her threat to delete those comments she doesn’t like), including a link to the Association of American Publishers’ competing point of view.

Comments (0) | Filed under: Current Events, Scholarly Publishing, Technology

September 20, 2011, 10:14 am

Game Theory

By Henry Woodbury

Why would scientific experts call on gamers to solve problems in protein folding? Here’s why:

“People have spatial reasoning skills, something computers are not yet good at,” [Dr. Seth Cooper, of the UW Department of Computer Science and Engineering] said. “Games provide a framework for bringing together the strengths of computers and humans.”

The game goes by the name Foldit and is supported by the University of Washington Center for Game Science.

When you first start playing, the game takes you through a series of practice examples to familiarize you with the manipulations you can apply to a protein chain.

Foldit Intro Puzzle 3 of 32

If you get hooked, you move on to real problems. Already, game-generated models have helped researchers resolve the structure of previously undefined proteins. Researchers are also looking at some of the unfolding sequences used by Foldit players to develop better algorithms for computer analysis.
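
For readers curious what manipulating a protein chain means computationally, here is a toy sketch in the spirit of the game, not Foldit’s actual scoring function. It uses the classic 2D “HP lattice” model from the protein folding literature, in which a fold scores better the more hydrophobic (H) residues end up adjacent without being neighbors in the chain; the sequence and coordinates below are invented.

    # Toy HP-lattice scorer (illustrative only; not Foldit's energy function).
    def hp_score(sequence, coords):
        """sequence: e.g. "HPHPPHH"; coords: one (x, y) lattice point per
        residue, describing a player's proposed fold."""
        score = 0
        for i in range(len(sequence)):
            for j in range(i + 2, len(sequence)):  # skip chain neighbors
                if sequence[i] == "H" and sequence[j] == "H":
                    dx = abs(coords[i][0] - coords[j][0])
                    dy = abs(coords[i][1] - coords[j][1])
                    if dx + dy == 1:  # adjacent on the lattice
                        score += 1
        return score

    # A human proposes a shape; the computer scores it instantly.
    fold = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 2), (1, 2), (2, 2)]
    print(hp_score("HPHPPHH", fold))  # 1 hydrophobic contact

Foldit’s real energy function is vastly richer, but the division of labor is the same: the human supplies the spatial intuition, the computer supplies the scoring.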

Comments (0) | Filed under: 3D Modeling, Creativity, Scholarly Publishing, Social Media, Technology

July 28, 2011, 12:58 pm

Hello Spatial Humanities, We’ve Been Waiting for You

By Henry Woodbury

Patricia Cohen at The New York Times has an interesting article on the “spatial humanities,” the idea of using geographic information systems to reveal the physical context of historical or even fictional events:

“Mapping spatial information reveals part of human history that otherwise we couldn’t possibly know,” said Anne Kelly Knowles, a geographer at Middlebury College in Vermont. “It enables you to see patterns and information that are literally invisible.” It adds layers of information to a map that can be added or taken off at will in various combinations; the same location can also be viewed back and forth over time at the click of a mouse.

The real joy of this feature is the portfolio of projects that accompanies the main overview. Here, for example, is a section from Ms. Knowles’ viewshed analysis of what General Robert E. Lee could actually see in the Battle of Gettysburg:

Fragment of Gettysburg Map created by Anne Kelly Knowles, Will Rousch, Caitrin Abshere and others; and National Archives, Maryland

The pale ovals represent areas that historians have previously assumed to be visible to Lee. In Ms. Knowles’ analysis, all the light areas of the map could have been visible, depending on tree lines.
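
For the technically inclined, the heart of a viewshed calculation is a simple line-of-sight test repeated along rays across an elevation raster. Here is a minimal sketch of that test over a one-dimensional elevation profile; the terrain numbers are invented, and real GIS tools add refinements such as earth curvature and tree canopy, the very uncertainty Ms. Knowles flags.

    # Minimal line-of-sight test over a 1D elevation profile (invented data).
    def visible(elevations, observer_height=2.0):
        """elevations[0] is the observer's cell; returns one boolean per
        cell, True where the terrain is visible from the observer."""
        eye = elevations[0] + observer_height
        result = [True]                # the observer's own cell
        max_slope = float("-inf")
        for d, z in enumerate(elevations[1:], start=1):
            slope = (z - eye) / d      # slope of the sight line to this cell
            result.append(slope >= max_slope)  # hidden if a nearer cell rose higher
            max_slope = max(max_slope, slope)
        return result

    # A ridge at distance 3 hides the cells behind it.
    print(visible([100, 100, 101, 108, 102, 101]))
    # [True, True, True, True, False, False]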

Comments (0) | Filed under: 3D Modeling, Diagrams, Information Design, Maps, Scholarly Publishing, Visual Explanation

July 29, 2010, 12:26 pm

Historic New England’s Collections Online

By Kirsten Robinson

The Portsmouth Herald has published an article about Historic New England’s new web site and online collections project, for which Dynamic Diagrams provided web strategy, information architecture and design services, as well as project management for the site’s development.

You can view the web site at www.historicnewengland.org or dive right into searching and browsing the online collections — full of photos, artifacts, and reference materials covering 400 years of New England history.

We’re currently in the final stage of the project, conducting usability tests on the new site.

Comments (0) | Filed under: Dynamic Diagrams News, Information Architecture, Scholarly Publishing, Usability, User Experience, Web Interface Design

June 3, 2010, 11:08 am

Visual Bias at Work

By Henry Woodbury

Last week I blogged about a Harvard Business Review article on the inherent biases in visualization. Visual information makes people overconfident of outcomes.

Today the New York Times offers a perfect example. In the debate around the U.S. health care overhaul, the president’s budget director, Peter Orszag, argued that savings could be found by reforming the current system:

Mr. Orszag displayed maps produced by Dartmouth researchers that appeared to show where the waste in the system could be found. Beige meant hospitals and regions that offered good, efficient care; chocolate meant bad and inefficient.

The maps made reform seem relatively easy to many in Congress, some of whom demanded the administration simply trim the money Medicare pays to hospitals and doctors in the brown zones. The administration promised to seriously consider doing just that. [my emphasis]

Unfortunately, the maps don’t show what they seem to show. While they show cost of care (a very specific kind of care, it should be noted), they don’t show quality of care. Nor do the maps show anything about the demographics of the patients being cared for.

The Times compares the Dartmouth map (on the left) to Medicare’s own analysis of hospital quality (on the right) to show the disconnect. However, the Medicare map raises questions of its own. To start with, it shows a suspicious correspondence to U.S. population density.

Health Care Cost vs. Quality (New York Times)

Perhaps quality of care relates to the proposition that higher population density creates demand for more specialists, which leads to better diagnoses. I’m sure I’m not the first person to think of this. Before anyone draws another map, let’s work on better analysis.
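
As a first step toward that better analysis, one could simply measure how strongly the quality scores track population density before drawing any map. A minimal sketch, with invented placeholder numbers rather than Medicare data:

    # Pearson correlation between population density and quality scores.
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    density = [50, 120, 900, 2400, 5300]   # people per sq. mile (hypothetical)
    quality = [61, 64, 72, 78, 83]         # composite quality score (hypothetical)
    print(round(pearson(density, quality), 2))  # 0.93

A correlation that high would suggest the quality map is partly restating the population map, exactly the suspicion raised above.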

Comments (0) | Filed under: Charts and Graphs, Cognitive Bias, Current Events, Information Design, Maps, Scholarly Publishing, Visual Explanation

February 20, 2010, 10:42 am

Visualizing More Affordable Care

By Henry Woodbury

The February 2010 issue of Obstetrics & Gynecology features work by Dynamic Diagrams for an article titled “Alternatives to a Routine Follow-Up Visit for Early Medical Abortion.” The article describes a protocol for assessing a woman’s health after an abortion without routine use of ultrasonography. To quote from the abstract:

We constructed five model algorithms for evaluating women’s postabortion status, each using a different assortment of data. Four of the algorithms (algorithms 1–4) rely on data collected by the woman and on the results of the low-sensitivity pregnancy test. Algorithm 5 relies on the woman’s assessment, the results of the pregnancy test, and follow-up physician assessment (sometimes including bimanual or speculum examination).

A sponsor of the study, Gynuity Health Projects, asked Dynamic Diagrams to visualize the data. Our explanation shows the results for the current standard of care and the five algorithms tested by the researchers. For each approach we show the total number of cases, the number of women returning to a clinic for a follow-up visit, and the number of women receiving a follow-up ultrasound. In contrasting colors we show specific additional treatment cases in two columns: those identified by the protocol on the left vs. those not necessarily identified by the protocol on the right. In large type we show the percentage of follow-up ultrasounds relative to the total number of cases. This combination of rich data points and a key percentage makes it easy to compare the effectiveness of each algorithm. A sample of this visual language (without labels) is shown below:

Alternatives to a Routine Follow-Up Visit for Early Medical Abortion, Figure 2

While we cannot reprint the full text of the article, we can provide the visual explanation used as Figure 2: Algorithms identifying women who received additional care after medical abortion (PDF, 409K).
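
The large-type number in the figure is straightforward arithmetic: follow-up ultrasounds as a share of total cases. A minimal sketch, with invented counts rather than the study’s data:

    # Key percentage from the figure: ultrasounds as a share of total cases.
    def ultrasound_rate(ultrasounds, total_cases):
        return 100.0 * ultrasounds / total_cases

    # Hypothetical algorithm: 42 of 600 women received a follow-up ultrasound.
    print(f"{ultrasound_rate(42, 600):.1f}%")  # 7.0%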

Comments (0) | Filed under: Books and Articles, Charts and Graphs, Dynamic Diagrams News, Infographics, Information Design, Scholarly Publishing, Visual Explanation

September 30, 2009, 9:10 am

TDR Launches New Interface Design

By Henry Woodbury

Today TDR updated its site with a new banner, color palette, and home page layout.

The design and rollout resulted from a close partnership between TDR, Dynamic Diagrams, and other consultants. The result is a fresh look and a home page layout that reflects the evolving use of the site.

Co-sponsored by UNICEF, UNDP, the World Bank, and WHO, TDR “funds research in infectious diseases of poverty, and provides support and training to researchers and institutions in the countries where these diseases occur.”

Comments (0) | Filed under: Dynamic Diagrams News, Scholarly Publishing, Web Interface Design

October 31, 2008, 10:51 am

Dynamic Diagrams Poster Part of Award-Winning Conference Presentation

By Mac McBurney

Congratulations to Colette Hannan on winning the Young Scientist Award for best poster presentation at the 17th International Conference of Racing Analysts and Veterinarians. Colette won the award for her presentation, “Controlling therapeutic substances – a European harmonised approach: Determination of the detection time for lidocaine following an administration to horses.”

Dynamic Diagrams designed posters for five research studies conducted by BHP Labs in Limerick, Ireland, where Colette works as a chemist. The studies tracked how long drugs like lidocaine and morphine remain — or remain detectable — in race horses. These drugs are legitimate veterinary medications, but they’re a big no-no if your horse tests positive on race day.

The posters present research data and findings to a scientific audience, so we retained the organizing principles of a scientific poster or paper (methods, results, conclusions). In the central graphs, a circular blow-up shows the data of greatest interest. A timeline down the center shows when blood and urine samples were collected from the horses.

Comments (0) | Filed under: Dynamic Diagrams News, Illustration, Information Design, Scholarly Publishing

May 5, 2008, 2:05 pm

Harvard Business Review Discovers “Emerging Science of Visualization”

By Mac McBurney

Martin Wattenberg and Fernanda Viégas, the two best-known creators of IBM Research’s Many Eyes, brief business execs on the benefits of collaborative information visualization.

Our research has found that the compelling presentation of data through visualization’s advanced techniques generates a surprising volume of impassioned conversations. Viewers ask questions, make comments, and suggest theories for why there’s a downward trend here or a data cluster there. That level of engagement could foster the kind of grassroots innovation CEOs dream of.

The article is available in the May 2008 issue of Harvard Business Review and for free online (at least for now).

You’ll also find Viégas and Wattenberg in MoMA’s Design and the Elastic Mind exhibition.

Finally, for even more info-vis star-watching, Viégas and two other designers will join John Maeda (an info design rockstar if ever there was one) later this month for IN/VISIBLE: Graphic Data Revealed. From the event’s blurb:

The visual ethics required in information graphics increase the designer’s burden from faithful executor to editorial arbiter. How do design choices affect the integrity of the data being portrayed?

If you see me there, say hello: http://www.aigany.org/events/details/08FD/

Comments (0) | Filed under: Art, Books and Articles, Business, Current Events, Design, Information Design, Scholarly Publishing, Visual Explanation

January 2, 2008, 11:10 am

American Physical Society Launches Dynamic Diagrams Redesign of Physical Review Letters

By Lisa Agustin

The American Physical Society’s flagship journal, Physical Review Letters, has a new look and feel, thanks to a redesign by Dynamic Diagrams. Along with an updated masthead, the redesign features clearer navigation options, quick access to content from the current issue via a tabbed interface, and a snapshot of the latest news and most cited papers. As part of the PRL redesign, Dynamic Diagrams also designed a sub-site to commemorate the journal’s 50th anniversary, which includes an interactive timeline of notable papers and events since 1893. The PRL redesign is part of a larger effort to create a visual design system that will be applied to eight additional journals and the APS Journals umbrella site. Redesigned versions of these sites will be launched in the coming months.

Comments (0) | Filed under: Dynamic Diagrams News, Scholarly Publishing, Web Interface Design

October 4, 2007, 1:08 pm

Charts, Graphs, and Narrative

By Henry Woodbury

In an interview in Inside Higher Ed, economist Robert Frank discusses the problem of teaching the fundamental concepts of his discipline. Researchers found that students coming out of an introductory economics class scored worse on an applicable exam than those who had never taken any economics courses whatsoever. So Frank, with co-author Ben Bernanke, wrote a new standard text.

While economics is the pivot for the interview, Frank offers many insights about how people gather and use information:

The narrative theory of learning now tells us that information gets into the brain a lot more easily in some forms than others. You can get information into the student’s brain in the form of equations and graphs, yes, but it’s a lot of work to do that. If you can wrap the same ideas around stories, around narratives, they seem to slide into the brain without any effort at all. After all, we evolved as storytellers; that’s what we’re good at. That’s how we always exchanged ideas and information. And if a narrative has an actor, a plot, if it makes sense, then the brain stores it quite easily; you can pull it up for further processing without any effort; you can repeat the story to others. Those seem to be the steps that really make for active learning in the brain.

Then there’s this pithy definition of behavioral economics:

One of its founders, Amos Tversky, was a psychologist at Stanford. He liked to say his colleagues study artificial intelligence; he prefers to study natural stupidity — the cognitive errors people are prone to make. It’s not that we’re stupid, but we use heuristics, we use rules of thumb, and the heuristics work well enough on average across a broad range of circumstances, but unless you really understand the logic of weighing costs and benefits, it’s very easy to be fooled into making the wrong decision.

Sounds like usability research, no?

Frank is also the author of The Economic Naturalist: In Search of Explanations for Everyday Enigmas and a periodic essayist for the New York Times.

Comments (0) | Filed under: Cognitive Bias, Scholarly Publishing

June 20, 2007, 10:48 am

Speciespedia

By Henry Woodbury

Almost three hundred years ago, the great doctor and zoologist Carl Linnaeus published Systema Naturae, an attempt to classify all living things by scientific principles. Linnaeus’ work marks a watershed in the use of a hierarchical taxonomy to name living organisms. Over 13 editions, Systema Naturae grew from an 11-page pamphlet to a dense 3,000-page catalog as students and colleagues sent Linnaeus specimens from their travels.

In this spirit of research and collaboration, a group of leading scientific institutions has joined together to create the Encyclopedia of Life, a Web site designed to identify all living species:

Over the next 10 years, the Encyclopedia of Life will create Internet pages for all 1.8 million species currently named. It will expedite the classification of the millions of species yet to be discovered and catalogued as well. The pages…will provide written information and, when available, photographs, video, sound, location maps, and other multimedia information on each species. Built on the scientific integrity of thousands of experts around the globe, the Encyclopedia will be a moderated wiki-style environment, freely available to all users everywhere.

Comments (0) | Filed under: Scholarly Publishing

November 30, 2006, 11:05 am

Scholarly Publishing Meets YouTube

By Lisa Agustin

One of the challenges in scientific research involves the transfer of knowledge: explaining, and then understanding and learning, laboratory techniques. This can be a time-consuming process, especially if the techniques are state-of-the-art or experimental. While written protocols are often quite detailed, even these can be prone to misinterpretation. The newly released Journal of Visualized Experiments wants to address the knowledge-transfer hurdle by offering video-based (“visualized”) biological research studies online.

By presenting research in the form of “video-articles,” JoVE makes the equipment, samples, and steps taken transparent. (Supporting written documentation is also provided.) JoVE is similar to a traditional scientific journal in that researchers are invited to submit their work, which is then reviewed by an editorial board before being posted. In the future, JoVE plans to list its offerings in PubMed and other databases. In the true spirit of the Web, submissions and access to the journal will be free. One only hopes that this model will be able to sustain itself via funding or other means for the benefit of the larger scientific community.

Comments (0) | Filed under: Scholarly Publishing, Visual Explanation

May 18, 2006, 8:32 am

The Universal Library and Who Owns It

By Henry Woodbury

The New York Times Magazine this week sports a long essay by Kevin Kelly about the possibilities of an electronic, universal library:

When fully digitized, [all the information in the world] could be compressed (at current technological rates) onto 50 petabyte hard disks. Today you need a building about the size of a small-town library to house 50 petabytes. With tomorrow’s technology, it will all fit onto your iPod. When that happens, the library of all libraries will ride in your purse or wallet — if it doesn’t plug directly into your brain with thin white cords.

As a “senior maverick” at Wired magazine, Kelly unfolds some very interesting and imaginative possibilities. After discussing the obvious advantages of linked bibliographies and cross-references, Kelly elaborates on “Books: the Liquid Version”:

At the same time, once digitized, books can be unraveled into single pages or be reduced further, into snippets of a page. These snippets will be remixed into reordered books and virtual bookshelves. Just as the music audience now juggles and reorders songs into new albums (or “playlists,” as they are called in iTunes), the universal library will encourage the creation of virtual “bookshelves” — a collection of texts, some as short as a paragraph, others as long as entire books, that form a library shelf’s worth of specialized information. And as with music playlists, once created, these “bookshelves” will be published and swapped in the public commons. Indeed, some authors will begin to write books to be read as snippets or to be remixed as pages.

At the moment, writes Kelly, the real obstacle facing the universal library isn’t technology, but copyright:

In the world of books, the indefinite extension of copyright has had a perverse effect. It has created a vast collection of works that have been abandoned by publishers, a continent of books left permanently in the dark. In most cases, the original publisher simply doesn’t find it profitable to keep these books in print. In other cases, the publishing company doesn’t know whether it even owns the work, since author contracts in the past were not as explicit as they are now. The size of this abandoned library is shocking: about 75 percent of all books in the world’s libraries are orphaned. Only about 15 percent of all books are in the public domain. A luckier 10 percent are still in print. The rest, the bulk of our universal library, is dark.

Google has an answer. But it’s being contested by publishers. Read the article to get the gory details.

Comments (0) | Filed under: Business, Scholarly Publishing, Technology

December 9, 2005, 10:34 am

Creating a Digital Past

By d/D

Conserving digital information is turning out to be a tremendously complicated endeavor. One response is the DSpace Digital Repository, an open source platform for archiving electronic files developed by the MIT Libraries and Hewlett-Packard. In an article in IEEE’s Spectrum, MacKenzie Smith of the MIT Libraries gets into the details:

[S]aving raw data solves only part of the preservation problem. We also want to be able to read, play, or watch these bits when we need to. Then there are pesky legal obligations, which demand that we be able to guarantee that certain records haven’t been altered by human hands or computer malfunction.

http://www.spectrum.ieee.org/jul05/1568
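
One concrete technique behind the guarantee that records “haven’t been altered” is a fixity check: record a cryptographic checksum when a file is ingested, then recompute and compare it on later audits. A minimal sketch of the idea (illustrative only, not DSpace’s actual implementation; the filename is hypothetical):

    # Fixity check: detect silent alteration of an archived file.
    import hashlib

    def checksum(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    stored = checksum("thesis.pdf")          # recorded at ingest (hypothetical file)
    assert checksum("thesis.pdf") == stored  # verified on each later audit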

This is a project we were pleased to work on, creating a visual explanation that MIT can couple with articles and white papers to increase understanding of the technology.

Our DSpace case study on our Web site includes a PDF version of the diagram:

http://www.dynamicdiagrams.com/case_studies/mit_dspace.html

Comments (0) | Filed under: Scholarly Publishing, Visual Explanation

April 7, 2005, 1:32 pm

The New York Public Library Digital Gallery

By d/D

In a triumph of open access, the New York Public Library’s Digital Gallery delivers a huge number of historic and cultural images from the library’s collections to the Web:

“There are over two hundred thousand individual items now and growing, for many collections are still in production and/or development. The goal is half a million images and more, though they will still represent but a fraction of the Research Libraries’ overall visual holdings….The low-resolution images available on the website provide good-quality reference copies for a wide range of educational, creative, and research purposes.”

http://digitalgallery.nypl.org/nypldigital/index.cfm

Comments (0) | Filed under: Scholarly Publishing

January 12, 2005, 2:29 pm

For Academics: Blog or Perish?

By d/D

Professor Tyler Cowen of George Mason University addresses the question: “how [do] blogging and academic scholarship go together?” Specifically, he wonders what might have inspired Professors Richard Posner and Gary Becker to enter the fray:

“I’ve heard that if Posner were a law school, his citation index would put him in or close to the top ten. And Becker just gave up his Business Week column a few months ago. He is also the most widely cited living economist, not to mention that Nobel Prize. So why are they blogging?”

http://www.marginalrevolution.com/marginalrevolution/2004/11/the_scholarly_c.html

The Becker-Posner blog is at:

http://www.becker-posner-blog.com/

Cowen credits Northwestern University Professor Eszter Hargittai with raising the question; Hargittai’s writings include some thought-provoking ideas and many links to other opinions on the subject:

“There are posts on blogs that are certainly much more original and careful in their arguments (and more clearly written) than many articles that get published in academic journals. I think people’s reluctance to consider blog writing as comparable to journal publishing comes from thinking about journals in a somewhat romanticized and unrealistic manner.”

http://www.crookedtimber.org/archives/002884.html

Comments (0) | Filed under: Scholarly Publishing, Technology

January 12, 2005, 2:27 pm

Collective Editing via Wiki

By d/D

Five years ago, Harvard Law Professor Lawrence Lessig published Code and Other Laws of Cyberspace, a well-reviewed book that argues that the Internet’s hardware and software protocols determine how the medium is controlled by vested interests.

To update the book, Lessig has decided to post its contents to a Wiki, a platform for collaborative editing by everyday users (most famously in the Wikipedia encyclopedia). Lessig will then edit the Wiki-based updates to produce the final new edition:

“My aim is not to write a new book; my aim is to correct and update the existing book. But I’m eager for advice and expert direction…. No one can know whether this will work. But if it does, it could be very interesting.”

http://www.lessig.org/blog/archives/002358.shtml

Comments (0) | Filed under: Books and Articles, Scholarly Publishing, Technology

December 8, 2004, 2:53 pm

The OCLC Best Seller List

By d/D

The OCLC Online Computer Library Center has compiled a list of the “Top 1000” published works, based on the holdings of its member libraries. While something of an exercise in trivia (check out the “Factoids” page), the OCLC’s researchers have effectively created a book list “hub” with multiple ways to view their own list and links to many other “top books” lists.

http://www.oclc.org/research/top1000/default.htm

Comments (0) | Filed under: Scholarly Publishing

December 8, 2004, 2:48 pm

The Online News… from 1836

By d/D

This past month, National Endowment for the Humanities Chairman Bruce Cole announced a project with the U.S. Library of Congress to place 30 million pages of old newspapers online:

“Now, with this new digital program, you will see the papers just as they were–you will be able to search the actual page. The technique is OCR–optical character recognition. In fact, there is already a model up on the Library of Congress site. It’s got the Stars and Stripes from World War One. It shows you the whole page and there’s a zoom device so you can focus in on a single story and be able to read it. It’s key word searchable. It’s a quantum leap from trying to read microfilm.”

The archive will start in 1836, the point at which the OCR technology can read typical newspaper type, and end in 1922, after which copyright issues come into play. However, all newspapers published in the United States, from 1690 to the present, will be included in an associated online bibliography.

http://www.neh.gov/whoweare/speeches/11162004.html

To see how the technology works, you can go to the Library of Congress’ Stars and Stripes archive and select any issue:

http://memory.loc.gov/ammem/sgphtml/sashtml/sashome.html
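
To experiment with the same basic technique yourself, the open-source Tesseract engine will do rough OCR on any scanned page. A minimal sketch using its Python wrapper, pytesseract, assuming both are installed; the filename and search term are hypothetical, and the Library of Congress of course uses its own tooling:

    # OCR a scanned newspaper page, then search the recognized text.
    from PIL import Image
    import pytesseract

    page = Image.open("stars_and_stripes_p1.png")  # hypothetical scan
    text = pytesseract.image_to_string(page)

    # Once recognized, the page is keyword-searchable:
    if "armistice" in text.lower():
        print("match found on this page")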

Comments (0) | Filed under: Scholarly Publishing, Technology

December 8, 2004, 2:44 pm

Google Rules for Scholarly Content

By d/D

Google’s new “Google Scholar” (http://scholar.google.com/) is a public search engine specifically targeted to scholarly information. Of interest are the implications of Google’s typically terse recommendations for submitting and accessing different kinds of content. Regarding abstracts, for example, Google requires open access:

“Regardless of the source, you should be able to see an abstract for any article, with the exception of those that are offline and referenced in citations only. Please let us know if you don’t see even an abstract.”

These are issues we’ve encountered many times in our work for university publishers and professional associations. Google’s recommendations are likely to start turning good practices into industry standards.

http://scholar.google.com/scholar/about.html

Google Scholar is generating a lot of interest online; here are two reports:

http://searchenginewatch.com/searchday/article.php/3437471

http://www.resourceshelf.com/2004/11/wow-its-google-scholar.html

Comments (0) | Filed under: Information Architecture, Scholarly Publishing, Technology

October 13, 2004, 9:57 am

The Afterlife of Digital Information

By d/D

A recent story on National Public Radio describes a Library of Congress initiative to preserve digital information that can propagate, change, and disappear without a trace. As of December 2000, the Internet, just one digital medium, had more than 4 billion Web pages whose average life was 44 days.

Speaking to NPR’s Robert Siegel, Laura Campbell of the Library of Congress compared the repository to government photography archives from World War II:

“We can’t collect everything but we can certainly take a snapshot in time to tell the story about what local life was like…. We will have people who go through and sample what’s on the Web so that we can create that same kind of archive.”

http://www.npr.org/rundowns/segment.php?wfId=4062797

The Digital Preservation Program web site describes the breadth of the undertaking. Projects range from the technical development of Web archiving tools to the funding of specific archives to the defining of metadata standards:

http://www.digitalpreservation.gov/about/

Comments (0) | Filed under: Scholarly Publishing, Technology

August 12, 2004, 3:40 pm

Update: The Tube as Template

By d/D

We recently came across another example of the London Underground map as design template. In this case it is repurposed as “A subway map of cancer pathways”:

http://www.nature.com/nrc/poster/subpathways/index.html

Comments (0) | Filed under: Information Design, Maps, Scholarly Publishing

April 8, 2004, 8:36 am

Rethinking Encyclopedias

By d/D

Over the past decade the expansion of electronic alternatives has dramatically undermined the encyclopedia market. In response, publishers are looking for ways to obtain more value from their content:

“The shrunken reference powers that survived the shakeout — namely Britannica, World Book, and Grolier, the maker of Encyclopedia Americana now owned by Scholastic Library Publishing — have now retooled to focus more on online products.

“Voluminous sets are still printed, but mostly only for institutions. The encyclopedia companies are also targeting consumers with more concise and less expensive reference books.”

Online, the possibilities are exciting. The same data that drives an encyclopedia Web site could be queried by many different kinds of customized informational tools. The success of such tools, however, depends upon equally customized information architectures, each tailored to help a specific audience extract meaningful information from a specific body of content.
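
As a toy illustration of that idea, here is one structured entry serving two differently tailored views; the data and field names below are invented:

    # One content store, two audience-specific views (invented data).
    entry = {
        "title": "Photosynthesis",
        "summary": "How plants turn sunlight into food.",
        "detail": "Light-dependent reactions in the thylakoid membrane...",
        "view_for": {"student": "summary", "researcher": "detail"},
    }

    def render(entry, audience):
        field = entry["view_for"][audience]
        return f"{entry['title']}: {entry[field]}"

    print(render(entry, "student"))     # the concise consumer product
    print(render(entry, "researcher"))  # the deeper institutional product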

http://www.cnn.com/2004/TECH/internet/03/11/disappearing.encyclopedia.ap/

Comments (0) | Filed under: Scholarly Publishing, Technology

January 8, 2004, 9:41 am

No More Unbinding and Rebinding

By d/D

In its understated way, Micropaleontology Press, featured in a recent NPR story (http://www.npr.org/features/feature.php?wfId=1572223), points out another advantage of electronic media:

“In 2003, the Foraminifera Catalogue reached 106 looseleaf volumes containing more than 87,000 pages … Since all the printed volumes must be unbound and rebound each year for the alphabetic insertion of 500 to 600 additional pages, the internet edition has quickly become popular.”

See the very bottom of http://micropress.org/history.html

Comments (0) | Filed under: Scholarly Publishing, Technology

January 8, 2004, 9:40 am

New Pricing Model from HighWire Press

By d/D

The dissatisfaction of academic librarians with current purchasing options is affecting electronic journal aggregators such as HighWire Press. Teaming with a group of scholarly society publishers, HighWire Press recently announced an alternative to bundled subscription packages:

“Initiated by a group of scholarly society publishers participating in HighWire, the new pricing/subscription model offers an alternative to the ‘Big Deal’ packages and allows librarians to create their own packages using tiered pricing tied to library type.”

http://www.infotoday.com/newsbreaks/nb031208-2.shtml

Comments (0) | Filed under: Scholarly Publishing

December 8, 2003, 9:47 am

Dead Links and Scholarly Research

By d/D

When footnotes are URLs, footnotes disappear. Faster than you may think:

“In research described in the journal Science last month, the team looked at footnotes from scientific articles in three major journals — the New England Journal of Medicine, Science and Nature — at three months, 15 months and 27 months after publication. The prevalence of inactive Internet references grew during those intervals from 3.8 percent to 10 percent to 13 percent.”

http://www.washingtonpost.com/ac2/wp-dyn/A8730-2003Nov23

Mentioned in the article is the digital object identifier system known as DOI. This is a system we’ve seen used effectively in our work for scientific publishers. The DOI web site is http://www.doi.org/.
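
The way a DOI sidesteps link rot is simple: the citation stores a permanent identifier, and the central resolver redirects to wherever the article currently lives. A minimal sketch (the DOI below is the DOI Handbook’s own example, and the requests library is assumed):

    # Resolve a DOI to the article's current location.
    import requests

    doi = "10.1000/182"  # example DOI: the DOI Handbook itself
    resp = requests.get(f"https://doi.org/{doi}", allow_redirects=False)
    print(resp.status_code, resp.headers.get("Location"))  # e.g. 302 + current URL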

Comments (0) | Filed under: Scholarly Publishing, Technology