Information Design Watch
October 13, 2004, 3:18 pm
In a press release, the Wikimedia Foundation announces the publication of the one millionth article in its free, open-content, online encyclopedia.
“Wikipedia is created entirely by volunteers who contribute, update, and revise articles in a collaborative process…. Contributors build upon each other’s changes and flawed edits are quickly repaired. ‘Everything is peer-reviewed in real time,’ said [Wikipedia founder Jimmy] Wales.”
The English version of Wikipedia is at:
October 13, 2004, 9:57 am
A recent story on National Public Radio describes a Library of Congress initiative to preserve digital information, which can propagate, change, and disappear without a trace. As of December 2000, the Internet, just one digital medium, held more than four billion Web pages, with an average page lifespan of 44 days.
Speaking to NPR’s Robert Siegel, Laura Campbell of the Library of Congress compared the repository to government photography archives from World War II:
“We can’t collect everything but we can certainly take a snapshot in time to tell the story about what local life was like…. We will have people who go through and sample what’s on the Web so that we can create that same kind of archive.”
The Digital Preservation Program Web site describes the breadth of the undertaking. Projects range from the technical development of Web archiving tools, to the funding of specific archives, to the definition of metadata standards:
October 10, 2004, 3:27 pm
MoreGoogle is a small downloadable program that enhances Google search results with screen images, links and other bits of contextual information. This all happens at the browser level, after Google serves the page:
“…you get the exact, original Google search results. After your browser has loaded the search results, MoreGoogle adds features, but does not alter the results in any way.”
Since the beginning of the Web, designers and developers have debated whether content should be separated from design. Until now, the debate has focused on the extent to which a client program should control design. With MoreGoogle and tools such as PurpleSlurple™, Web content itself proves equally malleable.
October 10, 2004, 3:26 pm
In recognition of the upcoming U.S. Presidential election, we link to the Electoral College map on Princeton University Professor Sam Wang's Electoral College Meta-Analysis Web site. The map resizes each U.S. state to correspond to its share of electoral votes, making it arguably "truer" to its data than geographically based projections. The map is also interactive; individual states can be assigned to one of the two candidates (or to neither) to show different possible electoral outcomes.
http://synapse.princeton.edu/~sam/pollcalc.html#EVmap (click on static map to see interactive map in a popup window)
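The resizing principle behind such a cartogram is simple to state: each state's drawn area is made proportional to its share of the 538 electoral votes rather than to its geographic area. A minimal sketch of that scaling rule, in Python, is below; the function and the four-state subset are our own illustration (using 2004 electoral-vote counts), not code from Wang's site.

```python
# Illustrative 2004 electoral-vote counts for a handful of states.
ELECTORAL_VOTES = {"California": 55, "Texas": 34, "New York": 31, "Florida": 27}
TOTAL_VOTES = 538  # total electoral votes nationwide

def cartogram_area(state: str, map_area: float = 1.0) -> float:
    """Return the portion of the total map area a state should occupy
    when area is drawn proportional to electoral-vote share."""
    return map_area * ELECTORAL_VOTES[state] / TOTAL_VOTES

for state, votes in ELECTORAL_VOTES.items():
    print(f"{state} ({votes} votes): {cartogram_area(state):.3f} of the map")
```

On such a map, California (55 votes) occupies roughly twice the area of Florida (27 votes), whatever their geographic sizes.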
The University of Virginia Library offers a collection of more traditional electoral maps (1860 to 1996):