

groups on current news and events. Les Hawkins (Library of Congress) provided a brief update on the activities of CONSER, discussing the impact of the reorganization at the Library of Congress on CONSER activities and highlighting the new training Web page (http://www.loc.gov/catworkshop/), noting that the majority of the materials are available free of charge. Hawkins informed attendees that the CONSER Standard Record guidelines have been updated and that there are several MARBI proposals on defining the 588 field for use with serials “description based on” and “latest issue consulted” notes. He also highlighted the use of the Elluminate live online conferencing software by LC and the University of California CONSER Funnel for training and meeting purposes. Finally, a train-the-trainer session will be held at ALA Annual for the SCCTP holdings workshop.

Following Hawkins, Regina Romano Reynolds (Library of Congress) gave her ISSN report, updating attendees on the impact of the reorganization at LC on ISSN assignment. Assignment will now be dispersed among several sections, with Reynolds as coordinator of the effort. LC is also implementing an automated ISSN assignment program that will provide the user with the next available ISSN, so the LC community can move beyond the paper ledger system.

The ISSN manual has been updated and is now available online. The ISSN-L is set to be implemented once OCLC has added the necessary subfields during the next MARC update. Currently OCLC is working on several clean-up projects within the ISSN Centre database in preparation for populating it with the ISSN-L. Freely available tables, updated quarterly, map the medium-specific ISSNs to the ISSN-L and vice versa. Regarding programs to digitize back runs of serials without ISSNs, Reynolds confirmed that if the work is done by a publisher, the publisher can apply for an ISSN for the run, but if the work is done by an aggregator, the aggregator cannot apply for one. There is also a new program called Piloting an E-journals Preservation Registry Service.
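The quarterly tables are a simple two-way mapping; as a minimal sketch, assuming a tab-separated file of medium-specific ISSN and ISSN-L pairs (the exact file layout here is an assumption), a library could load one into lookup dictionaries in Python:

# Sketch: load an ISSN-to-ISSN-L mapping table into two lookup
# dictionaries. Assumes a tab-separated file of "ISSN<TAB>ISSN-L"
# pairs; the actual table layout may differ.
from collections import defaultdict

def load_issn_l_table(path):
    issn_to_l = {}                  # medium-specific ISSN -> ISSN-L
    l_to_issns = defaultdict(list)  # ISSN-L -> all related ISSNs
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.strip().split("\t")
            if len(parts) != 2:
                continue  # skip headers and blank lines
            issn, issn_l = parts
            issn_to_l[issn] = issn_l
            l_to_issns[issn_l].append(issn)
    return issn_to_l, l_to_issns

# Example: look up the ISSN-L for one ISSN, then every medium-specific
# ISSN that shares it (file name is hypothetical):
# issn_to_l, l_to_issns = load_issn_l_table("issn_to_issn_l.txt")
# print(l_to_issns[issn_to_l["1234-5678"]])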

After these updates, the main presentation began, with Steve Shadle (University of Washington) and Peter McCracken (Serials Solutions) discussing work with vendor records. Shadle began with a brief history of the various arguments and approaches to loading vendor records, their sources, and the benefits of loading them. In the past, records for electronic resources were dealt with on a title-by-title basis, a slow process. Currently, loading records is mainly a batch process. Ultimately, loading vendor records has consistently increased the use of the material, making it worth the time and effort. Record sources for e-journal packages include PAMS (publication access management services) and ERAMS. Once available records have been identified, libraries have to choose the scope of what they put in the catalog; options include packages and aggregators, sets, GPO materials from MARCIVE, and freely available items.

After discussing the general points, Shadle moved into a description of the experience at the University of Washington (UW). UW elected to load sets from Serials Solutions and MARCIVE, in addition to various other sets of analytics. Issues included variable record quality, deciding on a multiple- versus single-record approach (UW opted for the single-record approach), set management, and other possible uses of the data. Under the single-record approach, data manipulation is necessary to match incoming records to records already in the system. Other issues Shadle discussed included maintenance efforts and set management, including where the data are kept and how they are updated. A-to-Z lists are also fed by the record loads, and UW uses an e-serials holdings service to help maintain the data. Ultimately, multiple processes are needed to manage the record loads.
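As a rough illustration of the matching work the single-record approach entails (a generic sketch, not UW's actual process; the record structures and field names are assumptions):

# Sketch: match incoming vendor e-journal records to existing
# catalog records by ISSN so the online access point can be added
# to the existing (print) record instead of loading a duplicate.

def merge_vendor_records(catalog, vendor_records):
    """catalog: dict mapping ISSN -> catalog record (a dict).
    vendor_records: iterable of dicts with 'issn', 'title', 'url'."""
    unmatched = []
    for rec in vendor_records:
        match = catalog.get(rec.get("issn"))
        if match is not None:
            # Single-record approach: attach the URL (MARC 856-style)
            # to the record already in the system.
            match.setdefault("urls", []).append(rec["url"])
        else:
            unmatched.append(rec)  # review manually or load as new
    return unmatched

catalog = {"1234-5678": {"title": "Example Journal", "urls": []}}
leftovers = merge_vendor_records(
    catalog,
    [{"issn": "1234-5678", "title": "Example Journal",
      "url": "http://example.org/ej"}])
print(catalog["1234-5678"]["urls"], leftovers)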

Peter McCracken presented on the vendor perspective on providing MARC records. There has been massive growth in the number of e-journals in libraries. McCracken stressed the importance of focusing on core competencies and finding what provides the greatest value to patrons. Vendors must be willing to adapt to change and make it work for what customers want. Other key points are to simplify and to make the process as efficient as possible. When it comes to new product development, McCracken said that vendors need to find partners that want to be involved and that are willing to do the work. Above all, thinking “impress the librarians” is key to product development. The main challenge of developing and providing MARC records is that it all costs something, not just economically but also in opportunity and time investments.

McCracken's presentation was followed by a question-and-answer session with both presenters. Key points addressed included the importance of knowing system capabilities, the importance of a testing environment in which to solve problems and review, and the importance of reviewing possible MARC record sets and making decisions in a collaborative environment. Ultimately, vendors and libraries have to figure out what is important to users and determine the minimum functionalities for resource search and retrieval.

Notes

1. For the “Table of Contents” and “Executive Summary” of the spring 2007 Association of Research Libraries SPEC Kit survey on metadata practices, see http://www.arl.org/bm~doc/spec298web.pdf (accessed May 16, 2009).

2. For a list, see http://www.lockss.org/lockss/Publishers_and_Titles (accessed May 16, 2009).

doi:10.1016/j.serrev.2009.05.007

2009 Electronic Resources & Libraries Conference

Posie Aagaard, Helen Heinrich, Regina Koury and Lisa Kurt

Pre-conference: Marketing/Promoting Effective Use of E-resources

Barbie Elene Keiser (Barbie E. Keiser Inc.) delivered a half-day, information-packed session on effective marketing of e-resources, an overview of the four-day workshop she conducts. She drew an audience from academic, special, medical, and corporate libraries, as well as vendors. If you were looking for a fast how-to marketing session, this was not it; Keiser left no stone unturned. She provided in-depth information on marketing challenges, targeting user groups, methodologies for conducting user group assessments and audits, and mixing appropriate techniques and tools.

Keiser stressed that marketing is everyone's job; it is a strategic behavior, a process of evaluating how well you have done, adapting what works, and understanding the potential of branding. Begin marketing with your target audience and their distinct needs in mind. Look at what high school students are doing; their behaviors constitute the next generation of user needs.

An integral part of every marketing plan is a SWOT analysis (strengths, weaknesses, opportunities, and threats) and its extension, the TOWS matrix, which allows you to analyze your external environment (threats and opportunities) and internal environment (weaknesses and strengths). Another important part of marketing is library staff: it is not the number of staff you have but the skill sets they bring. For successful marketing, Keiser suggested creating a team of players with a variety of skill sets, emphasizing creativity and innovation, training in communication and behavioral skills, and mentoring your staff.

Don't be bound by preconceived notions of what the library can offer; broaden the base of products and services offered by the library and expand the user base. Keiser looked at the product's bell-shaped life cycle and how its elements (introduction, growth, and maturity) relate to one another, whether they make sense to patrons, and whether they help attain economies of scale.

Some of the practical marketing tools Keiser mentioned were marketing literature (brochures, catalogs, fact sheets, coupons), social networking sites, visualization tools (Space Tree, Anacubis, Web Brain), IM and mobile devices and, of course, word of mouth.

Welcome and Keynote Address: From Access to Associations: Making Connections with Electronic Resources

Elizabeth Goodman (PhD student, University of California, Berkeley, School of Information) provided the keynote address. Goodman's writing, research, and design work focus on ubiquitous computing, and her background is in interactive design. Her current work centers on how information technology affects everyday places and activities in urban environments.

Goodman began with interesting examples of ubiquitous computing; she described a scenario in which a mobile platform would be used by nurses for information work in hospitals. Goodman questioned how relationships between people would be affected if nurses could easily access medical or allergy records at the bedside of patients or as they moved around the hospital. How would this type of integrated mobile information change the way people relate to information, space, and time?

Goodman discussed specific tools that would make important things easier for users. She used online dating as an example: maps with chat integrated into them could let users determine where they might meet up. She later compared this to the work done in libraries, wherein users are matched up with books, and suggested that perhaps there might be a way of connecting users with similar interests to each other. Goodman stated that libraries traditionally are very stable and exist for very long periods of time; there exists a visceral trust in institutions and libraries, such that, in general, people feel these places keep information safe.

From Goodman's perspective, as information becomes digital, the mission of libraries and institutions does not change. Given more tools and technology, libraries would fill the same role in supporting the individual user with needed information.

Goodman then discussed associations and relationships, how mobility affects various users, and the interconnectedness of users with each other and with the places where they seek information. Digital information opens up a vast Web of connections, and Goodman mentioned the idea of users passing information to one another, as well as groups of users gathering information from sources such as the “library, grocery store, movie theaters and objects.”

In the next section, Goodman sketched out some rough ideas for ways libraries might use e-resources, which she called “information science-fictions.” She wanted to generate thought and discussion by telling stories. In the first example, she showed a tag cloud displayed prominently on the wall of a branch library in the District of Columbia Library System, resembling a digital mural of words. The tag cloud was generated by user activity, specifically search words that Erin Schmidt had gathered from circulation data comprising 17,000 searches conducted over a period of four days. The display was not only visually interesting but also allowed users to exchange ideas and made similarities visible, connecting them in a new and interesting way. In the second example, Goodman talked about “patron matchmaking”: introducing patrons to each other based on borrowing habits. She stated that there was a case to be made for sharing this information and that libraries could look into setting up a system where, if users agreed to share, they could meet others with similar intellectual interests. The third example was the most complex. To illustrate it she used a project currently in progress at UC Berkeley in which digital texts are enhanced with relevant resources via point-to-point links. For example, a name within a text may connect users to pictures or biographical information, while a place name may connect users to maps. Replicating serendipity is difficult, but allowing users to branch out from a starting point allows the possibility of discovering materials by chance. In addition, allowing users to contribute, creating more references for future users, would create interesting trails and lead users to more information through various navigation paths. This idea also opens the door to encouraging users to look at similar materials from different perspectives.

Goodman proposed interweaving electronic information with the traditional library services that people have come to trust to create a “digital building”: libraries could capitalize on the trust users place in them, using their tools and technology to allow exploration and to generate new connections and relationships for users.

In Perpetuity: Institutional and Implementation Challenges with Electronic Resources Librarianship

Lisa Sibert and Julia Gelfand (University of California, Irvine) began the session with a summary of personnel changes in e-resource management at UC Irvine from 2000 to the present. The presenters asserted that electronic resources have been treated like children seated at a dinner table apart from their adult print counterparts. However, they suggested that e-resources are maturing and increasingly becoming mainstream in collections. Libraries are making budgetary and staffing changes in support of this trend.

The presenters shared results of a survey conducted by UC Irvine staff from December 2008 through January 2009, designed to take the pulse of the e-resource profession today. The survey was posted to a number of electronic discussion lists to solicit feedback from staff in all types of libraries, but most respondents were from the academic library community. The survey examined a wide range of e-resource areas and issues, including budgeting trends, e-resource management administrative structures, the e-resource duties perceived as most important, usage statistics and licensing trends, and e-resource training practices.

Results show that 47 percent of respondents' library materials budgets are currently allocated to electronic resources, an impressive 14 percent increase over the past three years; even so, the presenters were surprised that the figure was not higher. Electronic books currently comprise the smallest portion of libraries' e-resources budgets, while the majority of those budgets are allocated to serials and databases. Respondents indicated, however, that their e-book budgets are growing.

Thirty-three percent of respondents report that they have neither an electronic resources librarian nor an electronic resources unit. In such cases, the duties are generally integrated with a collections position and are scattered across other departments with little coordination. On average, three to five staff have e-resource management duties; two to three of the positions are professional, while two are typically paraprofessional.

The top five e-resource responsibilities reported are ordering, activating, and claiming; licensing; e-resource selection; technical support for access issues; and maintenance of the OpenURL link resolver. Duties are fairly evenly divided among acquisitions, serials, and other departments (e.g., collections, reference, systems, and technical services).

Most respondents actively collect usage statistics through vendors' reports or by gathering the statistics themselves from vendors' Web sites; only 19 percent of respondents use SUSHI (Standardized Usage Statistics Harvesting Initiative) with an ERMS (electronic resource management system) to gather usage statistics. The presenters expect that number to increase in the future. Not surprisingly, the most common uses of usage data are to assess cost per use, make collections decisions, and assess longitudinal value. Some unexpected uses include determining instructional needs, supporting accreditation, and determining staff training priorities.

The presenters noted that while licensing is a major responsibility, a large number of respondents do not follow best practices in license negotiation. Less than 25 percent of respondents have a model license agreement, and libraries often agree to a range of terms that are inconsistent across agreements. More than 65 percent of respondents did not indicate which positions in their libraries have signatory authority.

Fifty-seven percent of respondents use an ERMS, and those developed by Innovative Interfaces and Serials Solutions are the most commonly used. ERMS are most often used to manage holdings; manage licenses and license display; manage the e-resource life cycle and workflows; manage renewals; collect statistics; track contracts; troubleshoot technical issues; manage trials; and provide support for local funding decisions.

Most librarians with e-resource management responsibilities train in-house or at conferences. Methods of training were reported as on-the-job training; discussion with colleagues; self-teaching; professional reading and networking; and using e-resources guides such as the UKSG (United Kingdom Serials Group) guide.1

Perpetual access was the most frequently reported challenge in e-resources. Also highly rated were the sheer volume of work; the large number of resources; issues in holdings management; gathering and interpreting usage statistics; licensing issues; and managing change. The presenters believe that expanded ERMS implementation will reduce the prevalence of many of these issues.

The presenters reported that one of the primary findings from this survey is that e-resources are still treated as being special or different. While e-resources are still considered the newcomer in the library world, this attitude is rapidly eroding as e-resources become more prevalent.

Based on their survey results, the presenters predict four themes emerging in the future:

1. Growth of e-resources will continue and will outpace print resources in all disciplines, and a user-centered focus will remain the ultimate goal;
2. Usability and access will drive success for e-resources;
3. Libraries are increasingly becoming format-agnostic; and
4. Training will become a larger part of library school curricula as e-resources become a more central focus.

Anecdotally, the presenters also found the following trends in e-resources:

• Demand for perpetual access models is growing, as is demand for simplified and/or standardized licensing;
• IT and systems departments are becoming more involved with the increase in mobile access;
• The role of consortial arrangements continues to increase in importance;
• Library schools need to offer better, more distinct training in digital libraries and e-resource management; and
• Reference outreach will become even more critical as e-resources grow in the future.

Metadata Cross-walking, Data Quality, and Semantics: Repurposing MARC Records for Digital Collections

Lucas Mak's presentation described a project at Michigan State University (MSU) to create records in Qualified Dublin Core (QDC) by deriving data from the MARC records for digital images in its Sunday School Books historical collection. The collection contains nineteenth-century publications by religious societies and consists of 170 titles. Every title in the collection has a MARC record in the university's OPAC, but the goal was to include the titles in the Digital Assets Management System. With the help of MarcEdit software and the use of cross-walks and data mapping, MSU embarked on a project to create a set of records (output via XSLT) for its digital library.

The process of repurposing metadata presented a number of challenges for the project team. For example, repetitive information in the MARC record's fixed and variable fields, notes, and access points created duplication of information in the output records. Conversely, the absence of detailed information in the MARC records lumped data into the wrong QDC fields. Such was the case with the relator code that was absent from name entries, which made all names merge under the label “Creator,” regardless of the person's role. Overlapping but not duplicative data, such as language information in the fixed field (008) and variable field (041), also complicated cross-walking. Some data, such as that contained in the 490 field or title source notes, could not be mapped to any DC element at all and thus were lost. AACR2/MARC syntax for formatting questionable dates (e.g., 18--?) also presented a problem because expressing them as [between … and] affected sorting of the records. The separation of MARC records into serials and monographs added a layer of complexity, as publication dates are recorded in 260 $c for monographs but in the 362 field for serials. Summarizing the difficulties of cross-walking MARC elements into QDC, Mak especially noted the differences in data structure and semantics between MARC and QDC, catalogers' mistakes in the source records, and inconsistent application of MARC fields. All of these problems hindered cross-walking results. Reflecting on the lessons drawn from this project, the speaker pointed out that not all mistakes and inconsistencies in the source data are worth the time it takes to correct them, and that there should be a balance between correcting through XSLT and improving the source data. In the end, the quality of source data is of utmost importance; as Mak succinctly put it, “garbage in, garbage out.”
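The MSU project worked in MarcEdit and XSLT; the Python sketch below merely illustrates two of the mapping problems Mak described, the missing relator codes and the monograph/serial date split, with record structures and mappings simplified for the example:

# Sketch of two crosswalk problems described in the session, shown
# in Python rather than XSLT. A parsed MARC record is modeled as a
# dict of tag -> list of subfield dicts; the mapping is simplified.

def map_names(record):
    # With no relator code ($e or $4) in the name entries, every
    # 100/700 name can only be labeled "creator", whatever the
    # person's actual role was.
    creators = []
    for field in record.get("100", []) + record.get("700", []):
        role = field.get("e") or field.get("4") or "creator"
        creators.append((role, field["a"]))
    return creators

def map_date(record, is_serial):
    # Publication dates live in 260 $c for monographs but in the
    # 362 field for serials, so the crosswalk must branch on format.
    fields = record.get("362", []) if is_serial else record.get("260", [])
    return [f.get("c") or f.get("a") for f in fields]

rec = {"100": [{"a": "Smith, John"}], "260": [{"c": "18--?"}]}
print(map_names(rec), map_date(rec, is_serial=False))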

Shelflessness as a Virtue: Preserving Serendipity in an Online Reference Collection

Lyle Ford (acting head of reference) and Lisa O'Hara (acting head of cataloging), both at the University of Manitoba Libraries (UML), presented “Shelflessness as a Virtue.” Jared Whiklo, Web developer for UML, was on hand via chat for technical questions after the session. Their project originated from an interest in creating an online reference collection by pulling together their electronic reference books from across vendors and incorporating browsability, serendipity, and Web 2.0 tools to make their e-reference materials more discoverable to users. Ford began with the online Oxford English Dictionary's definition of serendipity: “The faculty of making happy and unexpected discoveries by accident. Also, the fact or an instance of such a discovery.” He went on to discuss the nature of use of reference materials: generally, reference titles are not the type of information users search for by specific title; these collections lean toward being useful through the act of browsing. He pointed out that the current library catalog permits some limited browsing, but in most cases users would need to know specifically what they were looking for rather than coming across material serendipitously. The group did an environmental scan of what other libraries were currently doing to provide access to electronic reference materials. Library literature they gathered supported the theory that print reference materials were, overall, not being used very much. Though many platforms, such as Springer and Oxford, provided access to electronic reference collections, usage statistics indicated that usage was low because the collections were often buried within the platform. They also discovered that 41 percent of ARL libraries did not have e-reference collections, that none included Web 2.0 features such as book covers that would allow browsing, and that none of the ARL libraries was doing quite what UML was hoping to do. The challenges the group identified included three separate departments with varying priorities, lack of time overall, and lack of Web space.

The project required the programming skills of Whiklo to get through the various steps: in summary, pulling MARC XML records from the library catalog into an indexer for parsing, then onto a Web server. These steps, taken in conjunction with collecting vendor book covers and Library of Congress classifications, made up the final electronic reference collection. Using Solr as the index and Cocoon as the framework gave them the flexibility, hierarchy, and structure, alongside the functionality, necessary to make browsability and serendipity possible.
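As a rough sketch of the indexing step, assuming a locally running Solr instance and already-parsed records (the core name and field names are illustrative assumptions; the session did not detail UML's actual schema):

# Sketch: post parsed e-reference records to a Solr index over HTTP
# using only the standard library. The core name and field names
# are assumptions; UML's actual Cocoon pipeline and schema differ.
import json
import urllib.request

SOLR_UPDATE_URL = "http://localhost:8983/solr/ereference/update?commit=true"

def index_records(records):
    """records: list of dicts, e.g. {'id', 'title', 'lc_class', 'cover_url'}."""
    body = json.dumps(records).encode("utf-8")
    req = urllib.request.Request(
        SOLR_UPDATE_URL, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 on success

# index_records([{"id": "b1234", "title": "Example Encyclopedia",
#                 "lc_class": "AE5", "cover_url": "http://example.org/c.jpg"}])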

Aside from the book covers, the other Web 2.0 tool the group worked to make available was tagging. The challenge they faced with allowing user tagging was how to limit abuse and irrelevant or “useless” tags. To assist them, they employed a blacklist and allowed themselves flexibility in monitoring the user tags without obstructing users. The group showed a prototype of the project, which may be viewed at http://libdevl.lib.umanitoba.ca/eReference/ (accessed March 10, 2009). With the prototype built and functional, they have been working with their UML colleagues to gather feedback, which shaped the next steps for the project: implementation, further testing and feedback, naming the collection, catalog cleanup, adjustments to classification to match local subjects, and eventually making it public to both staff and users.
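A minimal sketch of the kind of blacklist filtering described, with the word list and normalization rules purely illustrative:

# Sketch: screen user-submitted tags against a blacklist before
# saving them, logging rejects for staff review rather than
# confronting the user. The word list here is a placeholder.
import re

BLACKLIST = {"spam", "asdf", "test"}  # placeholder entries

def accept_tag(tag, reject_log):
    normalized = re.sub(r"[^a-z0-9 ]", "", tag.strip().lower())
    if not normalized or normalized in BLACKLIST:
        reject_log.append(tag)  # keep for monitoring, don't block the user
        return None
    return normalized

rejected = []
tags = [accept_tag(t, rejected) for t in ["Dictionaries ", "spam", "maps!"]]
print([t for t in tags if t], rejected)  # ['dictionaries', 'maps'] ['spam']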

Standards in the E-resource World: COUNTER, CORE and I2

In this session providing an update on existing and emerging e-resources standards (COUNTER, CORE, and I2), John McDonald (Claremont University Consortium) outlined the major new features of COUNTER's Code of Practice 3. They are:

• Journal Report 1a now provides statistics for backfile usage;
• XML-only reports are now available;
• An auditing component has been added;
• The reliability of reports has been improved through the mitigation of inflationary effects from federated searching, automated search engines, bots, etc.;
• Consortial reports have been improved; and
• SUSHI support has become a requirement for compliance with COUNTER.

McDonald also presented the future directions of the standard:

• Continued refinement of audit/compliance procedures;
• Better representation of new media (e.g., e-books);
• The representation of new communities, such as museums and art galleries; and
• A focus on user-centric statistical standards to measure what users do with the content they access.

Jeff Aipperspach (Serials Solutions) addressed the development of Cost Of Resource Exchange (CORE), a new standard that would facilitate the exchange of acquisitions information among integrated library systems (or business systems), ERMS, vendors, agents, and consortia. The development of the standard began in 2008 with the goal of defining financial data that can be used with any application. As single-ILS (integrated library system) dominance gives way to a multi-vendor environment, data need to be shared, not duplicated, between those systems. The existing SUSHI standard is not the optimal solution because it carries too much data. Additionally, as opposed to SUSHI, CORE is designed to provide real-time updates. The initial forces behind the development of the new standard were Aipperspach, Ed Riding (SirsiDynix), and Ted Koppel (then Ex Libris, now Auto-Graphics). They surveyed vendors to determine feasibility, picked CORE as the acronym, and in spring 2008 approached NISO with the idea of the CORE standard. The development of CORE has been geared toward defining the data, not the application, and is intended not to duplicate existing standards, such as ONIX SOH. The working group intends to issue the finalized draft of CORE by the end of 2009.

Tina Feick (Harrassowitz) described the emerging Institutional Identifiers, or I2, standard. Based on the premise that there are numerous institutional identifiers for different purposes (financial, membership, etc.), the standard aims to provide a universal identifier for each institution. I2 would bundle information about the institution in an accessible form that can be used at various stages of the journal supply chain. The current difficulties in developing this standard include, among others, identifiers not being international and the delineation of consortial hierarchies. The I2 standard could be used by publishers and libraries and in COUNTER reports, among other applications. During the question-and-answer session, Feick described the final product as a form of registry, which will be supported by a maintenance agency.

E-resource Statistics: What to Do When You Have No Money

With library budgets being cut, tracking online usage statistics to evaluate collection use becomes a more frequent and important task. It is especially cumbersome when you do not have a commercial product to do it for you and have to gather the numbers manually. Mary Walker (electronic resources librarian, Wichita State University (WSU) Libraries) spoke about the issues of tracking electronic resources, what WSU tracks, and how cost per use is calculated.

WSU tracks online usage for journals and databases using extensive spreadsheets. Database statistics are collected and recorded monthly, which is a very time-consuming process: although not every database is tracked, the data take from four to eight hours to collect and eight days to record. Walker shared the formulas used to calculate average monthly cost per use and cost per year. E-journal statistics are recorded biannually and take eight hours to collect, four hours to process, and a staggering fifty-six hours to record. Walker stressed that it is important to record both print and online ISSNs to match vendor titles, which, as we all know, may vary. The e-journal usage stats spreadsheet contains about 9,000 lines, and a mistyped title can be a problem. Walker also shared the address where the stats are stored: http://library.wichita.edu/colldev/dbstats.htm.
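Walker's exact formulas are not reproduced in this report; a generic sketch of the underlying arithmetic, assuming an annual cost and a list of monthly use counts as inputs:

# Sketch of generic cost-per-use arithmetic of the kind described;
# the formulas Walker actually shared may differ in detail.

def cost_per_use(annual_cost, monthly_uses):
    """annual_cost: subscription cost for the year.
    monthly_uses: list of use counts, one per month collected."""
    total_uses = sum(monthly_uses)
    if total_uses == 0:
        return None  # flag zero-use resources for review instead
    return annual_cost / total_uses

def average_monthly_cost(annual_cost, months=12):
    return annual_cost / months

uses = [120, 95, 143, 88]  # e.g., four months of database sessions
print(round(cost_per_use(5000.00, uses), 2))  # 11.21
print(average_monthly_cost(5000.00))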

Beyond Federated Search: The Next Generation of Information Discovery

Tracy L. Thomson-Przylucki (executive director, New England Law Library Consortium (NELLCO)) presented a session on an alternative to federated search: an open source discovery engine developed for the consortium with a grant from the Institute of Museum and Library Services (IMLS). Thomson-Przylucki opened the program with a brief history of NELLCO. The regional organization was founded in 1983; by 2003 it had expanded nationally, and in 2008 it became international. NELLCO now has more than one hundred members.

The main part of the presentation addressed the need for federated searching capability and the failure of current market solutions to satisfactorily meet that need. In describing why some e-resources remain underused, Thomson-Przylucki compared patrons' behavior in the “information mall” with shopping at the mall: they only visit “anchor stores,” such as Westlaw, Lexis, and Google, missing out on boutique offerings that provide valuable resources and are paid for by their libraries. In order to steer scholarship to relevant, important resources, unify disparate silos of information, and justify budgets, users need the ability to search across all information resources. However, a working group at Franklin Pierce Law Center (FPLC), charged with finding the best existing federated search engine, determined that the current solutions are not acceptable. Among the drawbacks of the present federated tools, the speaker noted excessive and unnecessary vendor server traffic, mistranslation of search queries, lack of results ranking, problems with de-duping, and skewed usage statistics. FPLC's vision of searching across all electronic content, keeping the librarian involved in collection development, and leveraging NELLCO's vendor relationships required a new solution. The answer came from enterprise search technology, in which a query is run against a pre-built index but the results link to content, not the index. Thus, funded by an IMLS National Leadership Grant, NELLCO is in the process of developing the Universal Search Solution, or U§. The consortium was awarded $384,000 over two years, ending on November 30, 2009, to build a single-search discovery tool for library print and e-resources (OPAC, subscription databases, free Internet content, local content, etc.), which will function as an alternative to federated search. A software company, Index Data, provided the open source development.

In conclusion, the speaker provided a demonstration of the test version of the Universal Search Solution by querying the index and then linking from the search results to content. Current vendors that provide data for populating the index include Lexis, Wilson, Westlaw, Gale, MyiLibrary, Readex, and many others.

Thomson-Przylucki explained that, as part of the consortial agreement, all NELLCO members are entitled to take advantage of this custom discovery tool at no cost, but they are obligated to designate a project manager, expose OPAC data for harvesting, and market and evaluate U§.

Let's Stop Talking about Institutional Repositories: A Study in Perceived Use-Value, Communication, and Publishing Services

This session's bold title underscores Catherine Mitchell, Matthew Winfield, and Elise Proulx's purposeful de-emphasis of the term “repository.” The word has been deliberately removed from the California Digital Library's eScholarship Publishing Program's title because its ambiguous connotations do not adequately represent the program's suite of publishing services.

The speakers began by posing the question, “What makes an institutional repository successful?” Discussions about successful versus unsuccessful repositories often focus on quantitative measures. In this sense, the eScholarship Repository would appear to be an unqualified success. Launched in 2002, it houses 26,000 open access papers, with nearly eight million downloads. The figures sound impressive, except that 26,000 papers are published every year on the collective University of California campuses.

The speakers identified possible impediments to the repository's success. Faculty members familiar with the service like it, but many potential users are not aware of the program, much less the services it offers. There is no clear incentive for faculty to publish their works in the repository. Some faculty are open access advocates, but for others, using a repository only adds an extra task to their publishing process. Further, lack of knowledge about peer review processes and copyright principles serves as an unfortunate disincentive for less experienced faculty seeking tenure and promotion. Faculty who post research to disciplinary repositories or personal or departmental Web sites may not see a benefit to also publishing their content in an institutional repository.

In response to these concerns, the California Digital Library decided to reinvent the service and eliminate the term “repository” altogether, rebranding the service as the eScholarship Publishing Program and defining a new focus on the publishing services and benefits offered. A new partnership with the University of California Press enabled expansion of publishing services and increased the program's marketing capabilities. A massive marketing campaign raised awareness of the service among groups and individuals throughout the UC system. Faculty were reassured that their publications are still cited in the professional literature and that, in fact, some open access publications are cited more frequently than traditional publications. Newly available free setup training and publishing support eliminate the need for specialized knowledge or skills to publish content. The program is promoted as a dependable, permanent publication archive, in contrast to some other content publishing and/or deposit options.

Along with a new logo that prominently identifies the University of California, the program's Web site has been redesigned based on user feedback, with a cleaner, simpler design that enables more robust searching, access, and publishing functions. The site emphasizes the publishing services offered and attracts users' attention with elegantly highlighted new content. The designers sought to form a more cohesive collection from the separate collections, while providing a broader context that represents the full breadth and depth of available content. New filters help users instantly expand and narrow search results, and improved content and citation viewing options enable users to more quickly and effectively determine content relevancy. Notably, PDF documents can be keyword searched using the site's search interface. For users linking directly from Google, a new PDF Web frame that resembles the eScholarship Web site helps properly associate documents with the repository and lends them legitimacy. The site is scheduled for release in fall 2009.

Remaking the eScholarship Publishing Program also involved forming a strategic partnership with the University of California Press to meet mutual and distinctly separate business needs. The two entities have different business models, goals, constituencies, and cultures, but they share the same needs for infrastructure, resources, technical support, and distribution channels. The new partnership has enabled the eScholarship Program to “move away from the highly wrought and narrowly focused projects undertaken in the past” and instead focus on sustainable services. The partners now jointly offer services such as electronic and print book and journal publication; electronic pre-print and post-print dissemination and persistent access/storage; high-quality book manufacturing from clients' PDFs; and peer review management. The partnership supports both open access and print sales business models. The eScholarship Publishing Program is now poised not only to earn a reputation as a service provider but also to gain associative legitimacy as a publisher in conjunction with UC Press. With the recent improvements and expansion of services, the eScholarship Publishing Program hopes to measure its success not only in the number of items to which it provides access but also in the quality of the collections and of the services designed to support academic publishing activities.

Creating and Maximizing the Use of Usage Stats

Smita Joshipura and James Carvalho (Arizona State Libraries) discussed the goals they had set out to accomplish to make the most of usage data. They wanted to develop a step-by-step process for creating a static online usage Web site and to maximize the use of the usage data they gathered. There were various challenges, including the time-consuming, tedious, and demanding process of collecting and disseminating usage data. Despite the challenges, they acknowledged that usage data are critical, practical, informative, and necessary to assist librarians in collection development decision making.

In 2007 they implemented an A-to-Z usage data Web site, which started with 200 items, including packages and titles, and grew to 370 items. They also identified whether the data were COUNTER-compliant for each item listed. Essentially, the A-to-Z usage Web page was generated by gathering and storing usage data, creating various directories and subdirectories, and using DOS commands to automate list generation. Overall, the whole process required a minimum of tools: Microsoft Excel, Notepad, and a secure FTP client.
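A rough Python equivalent of that list-generation step (the presenters used DOS commands; the directory layout and file naming here are assumptions):

# Sketch: generate a static A-to-Z HTML page from a directory of
# stored usage-data files, standing in for the DOS-command step
# the presenters described. Layout and naming are assumptions.
from pathlib import Path

def build_a_to_z(data_dir, out_file):
    rows = []
    for f in sorted(Path(data_dir).glob("*.xls"), key=lambda p: p.stem.lower()):
        name = f.stem.replace("_", " ")
        rows.append(f'<li><a href="{f.name}">{name}</a></li>')
    html = ("<html><body><h1>Usage Statistics A-to-Z</h1><ul>\n"
            + "\n".join(rows) + "\n</ul></body></html>")
    Path(out_file).write_text(html, encoding="utf-8")

# build_a_to_z("usage_data", "index.html")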

Joshipura and Carvalho performed a user survey once the usage Web site was implemented. The value of their tool was evidenced by the number of uses it received and by the positive responses the presenters received. However, while the Web site did assist in the decision-making process, selectors wanted to see more data; for some, the data were not adequate to make a decision. Additional feedback was that explicit definitions for non-COUNTER-compliant data must be incorporated, along with regular updates of the data and cost information.

Looking to the future, Joshipura and Carvalho hope to be able to host the data within an ERM. In March 2008 they implemented an ERM, populating resource and contact information. The second phase is to include coverage, usage, and license information. Unfortunately, processing usage data in the ERM is also a challenge: manually processing the usage data requires converting various formats into XML, and there are still few SUSHI-compliant providers.
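A minimal sketch of that conversion chore, turning one CSV usage report into simple XML with the standard library (element names are assumptions; a real load would follow the ERM vendor's required schema):

# Sketch: convert a CSV usage report to simple XML for an ERM load.
# Element names are invented for illustration.
import csv
import xml.etree.ElementTree as ET

def csv_to_xml(csv_path, xml_path):
    root = ET.Element("usage")
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expects a header row
            item = ET.SubElement(root, "report_item")
            for key, value in row.items():
                tag = key.strip().lower().replace(" ", "_")
                ET.SubElement(item, tag).text = value
    ET.ElementTree(root).write(xml_path, encoding="utf-8",
                               xml_declaration=True)

# csv_to_xml("jr1_2008.csv", "jr1_2008.xml")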

Composing Rock 'n Roll Stories from the Usage Data Blues

Jamene Brooks-Kiefer (resource linking librarian, Kansas State University) set an entertaining tone for the presentation by playing blues music from her laptop as attendees walked in. She explained that though there are challenges to dealing with usage data, the session was not going to address adding staff, learning SQL or Access, or, as she put it, “feeling sad and sorry.” Maintaining a more positive approach, Brooks-Kiefer related that the main focus of the session was working with what is available: using current usage data, improving skills with familiar tools, “creatively persuading stakeholders,” and using the data to solve problems and tell stories.

Brooks-Kiefer discussed the current state of data in general: organizations gather various forms of data, and librarians and stakeholders want decisions to be based on the data gathered, but overall there are still many data problems that need to be solved in order to make this system work. She acknowledged that many organizations have too much data coming from too many sources, that there is not enough staff to process all the data, and that often no one is interested in what the data say.

Brooks-Kiefer offered three solid solutions to these various data problems as her “rock 'n roll alternative”: work with smaller portions of data, learn to manipulate the data using Microsoft Excel, and present the data in interesting ways through stories. Brooks-Kiefer advised that the data chosen should be manageable, collected consistently, focused on a specific activity, and, preferably, spanning several years. She stressed that it is critical that the data be interesting and accessible to the person working with them. Some examples she provided of data to work with are service desk transactions, checkout/browse transactions, gate counts, link resolvers, content providers, federated search tools, A-to-Z lists, and proxy servers.

In dealing with the data chosen, Brooks-Kiefer recommended acknowledging limits and recognizing abilities, in addition to employing Excel. In regard to acknowledging limits, she stated that not all of an organization's data can be managed by one person and that, particularly at the start of a data project, assistance is necessary. In addition, tying the data analysis in some way to the organization's strategic plan, as well as publicizing the results, benefits everyone. Next, Brooks-Kiefer discussed her thoughts on the use of Microsoft Excel over other methods: Excel is used in nearly all organizations at all levels of staffing, has good support at little to no cost, and is “feature-rich” and versatile. She then gave a demo of some of those features and showed that, with minimal effort, it is possible to get Excel to do some of the processing work rather than doing it by hand.
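As a Python analogue to the kind of summarization Brooks-Kiefer demonstrated in Excel (her demo used Excel itself; the input columns here are assumptions):

# Sketch: total link-resolver clicks by month from a CSV with
# assumed "date" (YYYY-MM-DD) and "clicks" columns -- a Python
# stand-in for the Excel processing demoed in the session.
import csv
from collections import Counter

def monthly_totals(csv_path):
    totals = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            month = row["date"][:7]  # e.g., "2009-03"
            totals[month] += int(row["clicks"])
    return dict(sorted(totals.items()))

# print(monthly_totals("resolver_log.csv"))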

Brooks-Kiefer discovered that telling stories with the data gathered is a much more effective way of conveying information to others than simply showing a spreadsheet. Stories provide a strong medium for getting important messages across; because context is included, it is easier to convey exactly what the data are showing, and the storyteller can add emphasis to the areas that are most critical. Her advice for creating a data story was to write what you know, stay focused, and keep it short and meaningful. As for what to write about, Brooks-Kiefer suggested a variety of ways to pick a topic, noting that answering often-asked questions, disseminating interesting or crucial information, new discoveries, and stakeholder requests are all good areas to write about. She confirmed that data stories are everywhere and recommended finding out what patrons are asking and seeing whether patterns occur. Brooks-Kiefer encouraged attendees not to wait for the organization but instead to lead by example, start employing her strategies, and tell data stories. She concluded with the message, “Convince yourself to act; your actions will persuade others.”

Embedding Librarianship in Content Management Systems: Examples with Sakai

Jezmynne Westcott (science librarian, Claremont University Consortium) talked passionately about embedding the library in Sakai, an open source course management system, thus creating embedded librarians who are integrated into the community. Students and faculty are already using course management systems, and it is a logical step for libraries to “meet” them there, to make the library one click away, and to take the library out of the building. Westcott also pointed out that Sakai is great for distance learners.

In Sakai the librarian can become a class participant, which is beneficial to at-risk students. Librarians can see language barriers, students' needs, and information that can aid in the development of library instruction plans. Sakai lets users post Camtasia tutorials, tests, and quizzes; export glossaries (of library jargon, for example); and easily send announcements. Another great feature is the ability to embed IM services, such as Meebo, in order to chat with patrons.

“Just in Time” in Difficult Times: Lessons for Librarians

Vicky Reich (director and co-founder, LOCKSS Program, Stanford University Libraries) started her session by warning her audience that her talk would not be about LOCKSS or CLOCKSS and that she was taking a risk with this new presentation. Instead, Reich discussed the burning issue of digital preservation and the need for libraries to start thinking and doing something about it. According to Reich, for years we have been “abandoning just-in-case collections for just-in-time access.” When a library cuts 10 percent of its periodical budget, all libraries are affected. With paper subscriptions there are several copies, at UCLA, Stanford, and Hawaii, for example, but the same cannot be said of digital subscriptions: everyone is counting on one digital copy. If one of the more than 900 existing publishers of online content can no longer supply that content, Reich suggested, libraries may have nothing left. Libraries must act to prevent that meltdown from happening, and we must think about long-term online access. Reich then talked about LOCKSS, noting that the program has plenty of storage on its servers and will work with anyone to preserve any format.

This was an exciting and interactive session, with attendees sharing their stories and experiences dealing with publishers, online licensing clauses, and preservation efforts. Alabama libraries, for example, implemented a private LOCKSS network. Libraries inform and educate their patrons and have an obligation to keep the “memory” alive.

Electronic Resources to Go

Kristine Ferry, Holly Tomren and Lisa Sibert (University of California, Irvine (UCI)) delivered an eye-opening session on cutting-edge library services. They talked about the barely charted territory of tools that make it possible for patrons with mobile devices to access library resources. The number of users accessing the Internet via mobile devices is steadily increasing: in the US alone there are 260 million mobile device users, and the number is expected to reach 4.9 billion globally by 2012. Predictions indicate that mobile devices will be the primary Internet connection tool for most people in the world by 2020.

The presenters next shared various examples of what has been done so far to address this trend. The Arcadia Project at the University of Cambridge is addressing the role of academic libraries in the digital age, specifically library services for mobile device users. Ball State University libraries provide a mobile site through which their patrons can connect with library resources. Britannica has a mobile iPhone edition; Google Books offers approximately 1.5 million books for iPhone download; and Project Gutenberg, arXiv, and PubMed make their content available as well. Finally, Zotero, a bibliographic citation manager, works on a personal digital assistant (PDA) platform. At UCI, patrons can text a librarian or visit a PDA Resources library page at http://libguides.lib.uci.edu/pdas. Some of the resources available to UCI library patrons using handheld devices are PubMed and Epocrates Rx.

The challenges associated with providing library services for mobile devices were given as differentiating between electronic resources that are downloadable to smartphones and electronic resources that are accessible through library authentication but optimized for smartphone viewing; determining whether the library or IT is responsible for support; developing ways to gather usage statistics; encouraging vendors to optimize more electronic resources for mobile devices; and working with mobile device makers to improve the viewing quality of PDFs.

Managing Freely Available E-resource Collections with Today's Vendor-provided OpenURL Knowledge Bases: A Challenge in Quality Control

Chad Hutchens (University of Wyoming) and Michael Bloomberg (Augsburg College) observed that link failure rates for freely accessible journals tracked in their OpenURL knowledge bases (SFX and Serials Solutions' 360Link) appeared to be higher than those of their paid e-resources. During the session, they presented results of their 2008 survey, conducted to determine librarians' level of awareness of the issue and to gauge libraries' patterns of activation and access for freely accessible and open access e-journal collections tracked in commercial knowledge bases. Lastly, the authors presented the findings of their link-checking research.

Prior to conducting the survey, the authors identified and questioned the validity of some common assumptions about freely accessible collections: 1) knowledge base metadata are as accurate as those of subscribed collections; 2) knowledge base vendors use inclusion criteria to select titles; and 3) such collections are, in fact, comprised of freely accessible journals. The survey focused on the related issue of librarians' attitudes and behaviors toward such collections. Survey respondents were overwhelmingly academic librarians who use SFX or 360 Link and most often activate the Directory of Open Access Journals (DOAJ) and HighWire Press Free Journals collections (88 percent and 84 percent, respectively). Other collections comprised of freely accessible journals as grouped by knowledge base vendors (proprietary free titles) were a close third. Half of the libraries surveyed activate freely accessible titles by the entire collection, while the other half select on a title-by-title basis. When asked their attitude regarding free collections, most librarians responded, “Why not activate these resources? We want our users to be able to access as much content as is available.”

The concern is that libraries will “set and forget” freely accessible collections: activate an entire collection and then fail to monitor content and coverage changes over time. Overall, the survey data show that librarians are aware that free collections can be problematic and unstable. Some libraries only selectively activate the collections or avoid activating them altogether because of these known issues.

The authors used the link checker LinkLit (www.linklit.org) to test the failure rate of the relatively cohesive DOAJ collection as compared to the more loosely grouped proprietary free collections. DOAJ's error rates in 360 Link and SFX were similar, at about 4 percent and 7 percent failure, respectively. The proprietary free collections' error rates were significantly higher, at about 20 percent and 11 percent failure, respectively. The authors also manually checked random URLs and found that actual dates of coverage were broader than reported in approximately 40 percent of the sample.
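
For readers who want to reproduce this kind of test, a batch link check reduces to requesting each title-level URL exported from a knowledge base and counting the failures. The sketch below is illustrative only; it is not LinkLit, and the sample URLs and failure criteria are assumptions.

```python
# Minimal sketch of a batch link check over title-level URLs exported from
# an OpenURL knowledge base. Illustrative only: this is not LinkLit, and the
# sample URLs are hypothetical placeholders.
import urllib.error
import urllib.request

def failure_rate(urls, timeout=10):
    """Return the fraction of URLs that fail to load (HTTP error or no response)."""
    failures = 0
    for url in urls:
        try:
            # urlopen raises HTTPError (a URLError subclass) for 4xx/5xx codes.
            with urllib.request.urlopen(url, timeout=timeout):
                pass
        except (urllib.error.URLError, OSError):
            failures += 1
    return failures / len(urls)

# Hypothetical sample drawn from an activated free collection.
sample = [
    "https://example.org/journals/a",
    "https://example.org/journals/b",
]
print("Failure rate: {:.0%}".format(failure_rate(sample)))
```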

The authors concluded that most of the common assumptions they identified are incorrect. Instead, they discovered that there is a lower level of maintenance by vendors for free collections as compared to paid subscription databases, and there is a lack of title selection criteria. In addition, not all titles are necessarily free or open access, nor are they necessarily journals. Although librarians are aware of the challenges inherent in these collections, most still choose to activate them, either as whole collections or by individual title. The authors noted that librarians champion the open access movement, yet by activating freely accessible collections that may have higher than usual error rates, libraries are not devoting sufficient resources to ensuring correct metadata so that patrons can successfully connect with the research.

Note

1. United Kingdom Serials Group, The E-Resources Management Handbook, 2006, http://uksg.metapress.com/link.asp?id=6tuu9n7wfl18 (accessed May 19, 2009).

doi:10.1016/j.serrev.2009.05.006

2009 North Carolina Serials Conference: Are You Ready? New Opportunities in Challenging Times

Megan Griffin

The North Carolina (NC) Serials Conference took place March 27, 2009, in Chapel Hill, North Carolina. The theme of the one-day, sold-out conference was “Are You Ready? New Opportunities in Challenging Times.” While the sessions/speakers did not directly address serials-related issues, the conference focused on the challenges facing librarianship and the publishing industry and how players in both worlds are grappling with these larger issues.

Dr. Irene Owens (dean, North Carolina Central University (NCCU) School of Library and Information Science) opened the conference, calling the attendees to come together to make lasting, innovative changes in memory of John Hope Franklin, who had recently passed away. Carol Nicholson, associate director for technical services at the University of North Carolina Law Library and co-chair of the conference planning committee, introduced the other members of the planning committee and thanked the generous conference sponsors for making the conference possible.

Keynote Address 1: Disappearing Print and a Changing Economy, Launching New Management Models

John Drescher (senior vice-president and executive editor of the Raleigh News & Observer (N&O)) gave the first keynote presentation. Drescher acknowledged the dire circumstances faced by the N&O, along with many other newspapers across the country. The paper has endured three rounds of layoffs within a year, and the size of the newsroom staff has decreased 50 percent since 2003. The harsh economic climate has forced the newspaper to tackle challenges in two major categories: people and content.

Drescher stated that the first strategy in dealing with personnel challenges is to know your staff: be sensitive to their dispositions, their workloads, and their concerns. Another strategy is to be proactive and open with communication. Allowing time for staff to ask questions and receive positive feedback about their performance and the future of the paper has not only strengthened the sense of camaraderie among N&O staff members but has also allowed Drescher to continually emphasize the mission of the paper, which is to provide timely, accurate news to its readership. Finally, Drescher underscored the importance of optimistic leadership in challenging times. A leader who is upbeat and always looking for opportunities in challenges is far more likely to inspire his or her employees to innovative excellence.

Another serious challenge Drescher's industry faces is a content problem. Like many other newspapers across the country, the News & Observer is caught between the print and electronic worlds. The N&O reports record-high readership levels in both formats. As baby boomers retire, they often subscribe to the daily paper to stay connected to current events, and meanwhile the N&O's online presence continues to grow. As more readers use the electronic version, the revenues associated with the print version are disappearing. Revenue from classified advertisements has been in permanent decline as citizens have flocked to online venues such as eBay and craigslist to hawk their wares. In the past, the bulk of the newspaper's advertising revenue came from the automotive and housing industries, two markets that have been hardest hit since the economy took a downward turn in fall 2008. On the other hand, online readership continues to increase steadily, along with online advertising revenues. Usage studies indicate that the number of unique online readers and page views has increased dramatically over the past few years, leading N&O staff to offer more unique and specialized content for its online readership.

Consequently, Drescher reported that the newspaper has pursued excellence in both formats while maintaining efficiency in the face of a downsizing culture. Drescher encouraged conference attendees to embrace these kinds of challenges by seizing the crisis and looking for new avenues of specialized innovation. For example, the print format lends itself to portability, longer stories, and a sequential order to the paper's layout. Meanwhile, the online format allows for searchability, interaction among readers through the commenting feature, up-to-the-minute timeliness, and constant availability. To increase productivity, the N&O recently bought the company that owned a competing North Carolina paper, the Charlotte Observer. This action allowed the papers to expand content while decreasing duplication of effort and maintaining high-quality content. Fewer sports writers needed to be dispatched to cover the same game, so the staff was able to cover a wider variety of sports events. Fewer writers were needed to cover state politics, so more writers could be assigned to cover local or national politics as needs arose.
