University of California Law Library Using SkyRiver

A California university law library has started using SkyRiver for its cataloging services.

The University of California Davis Mabie Law Library, in Davis, CA, has more than 290,000 volumes of books in eight-plus miles of shelving, 150,000 volume-equivalents of microforms, and access to several proprietary online databases. In addition, a collection of more than 40,000 volumes of the basic sources of United States, British, Canadian, Australian, and international law is stored off site at the University of California's Northern Regional Library Facility.

SkyRiver links libraries with the bibliographic metadata they need to catalog library collections. Its database has more than 28 million titles and is constantly updated. Database files are supplemented by public domain metadata from SkyRiver's customer libraries and hundreds of other contributing libraries.

An annual subscription to SkyRiver cataloging services includes:

  • Download of data in MARC format;
  • A search engine with facets and tags;
  • One-click downloading;
  • Shelf-ready support, including spine labels;
  • Integration of MARC record supply from participating materials vendors;
  • Unlimited access to the SkyRiver database for searching, editing and downloading;
  • Unlimited record requesting, user license, data transfer; and
  • Unlimited access to the SkyRiver help desk.

"In addition to a high hit rate for our new titles, the database is easy to navigate, and we find records more quickly since the SkyRiver database is much cleaner than that of our legacy cataloging service," said Kathy Lin, Mabie Law Library cataloging department head.

Judy Janes, interim library director, added that switching to SkyRiver will also save the university money because the service is cheaper than its legacy system.

In June, SkyRiver announced that its customer base had grown by more than 100 percent compared to the same period last year, naming new clients including San Francisco State University, Escondido Public Library in Southern California, and Washoe County Public Library in Nevada.

For further information, visit the SkyRiver Web site.

About the Author

Tim Sohn is a 10-year veteran of the news business, having served in capacities from reporter to editor-in-chief of a variety of publications including Web sites, daily and weekly newspapers, consumer and trade magazines, and wire services. He can be reached at [email protected] and followed on Twitter @editortim.
