Monday, May 22, 2017

The Huffington Post: After Trump Was Elected, Librarians Had To Rethink Their System For Fact-Checking

The American Library Association wants to help you distinguish real news from fake with the help of CRAAP.

By Maddie Crum
March 9, 2017

If you’ve been a student in any capacity since the advent of the internet, you’re probably aware of the stigma around citing online sources in research papers and other academic pursuits.
Teachers and librarians have had to reconcile student interest in online sources ― and the relevance those sources have to their lives ― with the fact that, historically, websites haven't been as rigorously fact-checked as published books.

To help students take a clear-eyed approach to internet research, librarians like American Library Association (ALA) president Julie Todaro use a resource called the CRAAP test, created by the Meriam Library at California State University, Chico.

A widely used information evaluation system, CRAAP stands for currency, relevance, authority, accuracy and purpose. According to the CRAAP test, a 20-year-old article written by a PR firm, for example, would be less valid than a three-year-old statement made by an American president in a published memoir.

But now, due to President Donald Trump’s Twitter comments dismissing legitimate sources of information, including multiple attacks on The New York Times, the ALA is making some changes to the test’s criteria.




“We have standards for assessing news, and we had to go back in and change those,” Todaro told The Huffington Post in a phone interview. “We’re looking at having to flip what we’re talking about, taking a look at how many people said this, where they said it, what the statement was.”

Todaro and her team have worked to develop an update to the CRAAP test in which the “authority” component is more closely considered. “We have to talk about authority today and we have to have them not make the authority decision without the set of other facts like accuracy and currency,” she told Texas Standard.

“We talk differently about authority [now],” Todaro reiterated to HuffPost. “And we talk about credentials in a different way. We talk about going beyond a title that someone has.”

The CRAAP test is often applied to scientific or historical information, Todaro said, citing erroneous claims about the nonexistence of global warming or the Holocaust as examples of CRAAP-tested statements.

Tweaking the CRAAP test is just one way librarians are pivoting to meet the needs of citizens under Trump’s administration. In addition to helping readers access books, librarians are flexing their roles as community organizers and distributors of accurate information on immigration, trans rights and other issues, which Todaro describes as civil rights issues.

“Libraries aren’t partisan organizations. So it doesn’t matter how you voted, it doesn’t matter where you’re from. We can provide resources and services for everyone,” Todaro said. “We’re having to, sadly, take another look at the standard credibility that you and I, and children and adults everywhere, have taken for granted for years. That’s no longer there.”

Editorial note: The Toronto Public Library has created a "How to Spot Fake News" guide in a Canadian context.

Sunday, May 21, 2017

Oxford University Press Blog: How libraries served soldiers and civilians during WWI and WWII



By Katie D. Bennett
May 8, 2017

Essentials for war: supplies, soldiers, strategy, and…libraries? For the United States Army during both World War I and World War II, libraries were not only requested and appreciated by soldiers, but also established as a priority during times of war. In the midst of battle and bloodshed, libraries continued to serve American soldiers and citizens in several different facets of their lives.

“Their purposes reached far beyond housing a collection of books,” explained Cara Setsu Bertram, Visiting Archival Operations and Reference Specialist at the American Library Association Archives.

During World War I and World War II, camp libraries popped up at military bases in the United States and across Europe, stretching as far east as Siberia. These camp libraries were originally established by the American Library Association (ALA); at the end of World War I, ALA transferred control of them to the War Department, which maintains them to this day. ALA worked with the YMCA, the Knights of Columbus, and the American Red Cross to provide library services to other organizations, such as hospitals and rehabilitation centers.

These libraries were nothing glamorous—usually a shed, shack, or a hut built of wood and other available materials. They were run by librarians who volunteered to travel overseas to care for the libraries. Responsibilities included circulating the collections, maintaining them, weeding out books, and acquiring new ones. More than 1,000 librarians volunteered during World War I, and that number only increased with World War II.

“Servicemen at a Louisiana Library, circa 1942” courtesy of the
American Library Association Archives. Used with permission.
For many soldiers, Bertram explained, libraries were a place to relax, read, boost their morale, and educate themselves. Many soldiers were thinking about which jobs they wanted when they returned home toward the end of the war, so they read about skills for various lines of work. For some, this was their first exposure to books of any kind, and many illiterate men gained the opportunity to learn to read.

“What’s really interesting is that soldiers were interested in non-fiction, technical books,” Bertram added. “You would think they’d want to read a fiction book, something to take their minds elsewhere, but they were really interested in non-fiction.”

Despite the overwhelming interest in non-fiction, there were a few fiction books that resurfaced as favorites amongst military personnel, such as A Tree Grows in Brooklyn and The Great Gatsby. “These books were almost pulled out of obscurity and brought into popularity during WWII,” Bertram notes. “They were already classics, but they regained popularity during this time thanks to the soldiers.”

The practice of bibliotherapy gained traction during the two World Wars, as many soldiers used reading to treat PTSD, paranoia, insomnia, and other psychological disorders commonly suffered by veterans. Nurses read to injured and blinded soldiers to ease their suffering.

“US Army Hospital Ward Service, circa 1945” courtesy of the
American Library Association Archives. Used with permission.
Paranoia seeped into these libraries as well, in the form of censorship. Certain books were banned from camp libraries, among them books expressing pro-German sentiment, pro-socialist books, and books about pacifism. The war department wanted them pulled from libraries or completely destroyed. For citizens in the United States, the military demanded that all books on explosives, invisible ink, and ciphers be removed from libraries, and that any patron who requested such materials have their name put on a list to submit to the FBI for questioning.

Even with this careful curation by high-ranking officials, ALA sent over 10 million books to the armed services camps during its Victory Book Campaign from 1942–1943. Reading materials were distributed to the Army, the Navy, the American Red Cross, prisoners of war, and even the “war relocation centers,” a euphemism for the Japanese-American internment camps in the United States.

At the end of World War I, these camp libraries were taken over by the military, but one library in particular was created independently to serve the American ex-pats who remained in Europe after the war. ALA founded the American Library in Paris in 1920 to serve in part as a memorial for Alan Seeger, a young American poet, who was the son of Charles Seeger, a leader in the group of American ex-pats. The library was meant to be a haven for armed forces personnel serving their allies in World War I.

“One of the original missions of the library was to teach and show off the advances Americans had made in the field of library science,” said Charles Trueheart, current Director of the American Library in Paris. Library science was a very progressive subject, and the United States was far ahead of the French in that regard at the time. The library was a symbol of the United States becoming a true world power.
Photo “10 rue de l’Elysée”: the American Library in Paris in 1926.
From the American Library in Paris. Used with permission.

“Americans moved to France in large numbers to build these big institutions that they didn’t build anywhere else in the world,” Trueheart admitted.

After the First World War came the second, and the purpose of the library morphed. “It was the only library in Paris or France that had books in English and could stay open because of Vichy connection,” said Trueheart.  That connection is slightly controversial.

Throughout WWII, the library survived in large part thanks to Countess Clara Longworth de Chambrun. After the first director, Dorothy Reeder, was instructed to return home to the United States for her safety, Countess de Chambrun took over the library as the Nazis occupied France in 1940. Because her son was married to the daughter of the Vichy prime minister, Pierre Laval, the library was able to remain open throughout the war. The library continued to operate despite occasional confrontations with the Nazis, and even though her family was tied to the enemy, de Chambrun aided the resistance. While the library remained open under the guise of compliance with the Nazi cause, she ran an underground book service for Jewish patrons.

The value of this library, and of the camp libraries, throughout both World Wars was immeasurable. The American Library in Paris was a lending library, where people came to read and borrow books, and it was the only library in Paris, or indeed in France, with books in English that could stay open during World War II.

“There wasn’t anything like this and it was a treasure,” echoed Trueheart.

These libraries were safe havens for soldiers and civilians alike, and their existence during times of war is a true testament to the constant need for libraries.
Featured image credit: Photo of military personnel and a librarian in a camp library in France in 1919 from the American Library in Paris. Used with permission.

Source: Oxford University Press Blog

Saturday, May 20, 2017

Colorado Public Radio: How Denver Public Library Balances Books And Being A Homeless Shelter

By Michael Sakas
May 17, 2017

The downtown Denver Public Library.
A visit to the library likely means checking out a book or movie. But the Denver Public Library says its central location has another job these days — it’s somewhat of a homeless shelter.

“That is a role that we have not asked to play, but are playing,” says Michelle Jeske, the city librarian for Denver.

When the doors of the library open at 10 a.m., a mix of people usually wait outside to be let in. Some have materials to return or pick up, and others are seeking shelter.

James Short, who describes himself as residentially challenged, is one of the group waiting to get in. He’s a writer, and says he comes to the library nearly every day to work. Without a home, “I’d be drinking a lot more Starbucks coffee and using their internet,” Short says.

Of the crowd gathered at the Central Library on this day, Short was the only one willing to be interviewed. One man said he was too high to talk. Another didn’t want the plasma center to know he was homeless or he wouldn’t be able to donate.

Elissa Hardy, one of the Denver Public Library's social workers, points out that the library is one of the few public bathrooms in the city. “We don't open until 10 a.m. [weekdays]. So as you can imagine, if you're leaving shelter at 5 or 6 in the morning, that's five to six hours that you don't have access to the bathroom.”
James Short, left, describes himself as "residentially challenged."
He uses the library to do his work. Pictured at right, homeless
patrons carry belongings as they wait for the library to open.

Two years ago, the Denver library didn’t have a social worker on staff. Before she arrived, Hardy says, the library was doing the best it could. Now the position is becoming much more common at libraries.

“When I started, this was the third city to get a social worker in the library,” Hardy says. “And now there are dozens around the country.”

Hardy admits she never saw herself working for a library, simply because she knew it as the place “where I could come to get my books.” But she’s here, saying hello to patrons as she walks the seven floors of magazines, newspapers, and (yes) books. The building is huge — 540,000 square feet. In 1990, Denver voters approved a $91.6 million ballot measure to build the central library and other branch locations.

Today, Hardy says this multi-million dollar building is basically serving as Denver’s largest day shelter.

“I think that, the reason people often come here though, even though there are some other day shelter spaces, is because there are things to do. And there's resources, you can be another human in the community,” she says.

Hardy finds that most of the people who wait outside in the morning head straight to the computers on the fourth floor. That’s where some of them, like Short, do their work. Sleeping in the library isn’t allowed, but a few people appear to be nodding off at tables with their belongings tucked under their seats.

Jeske, Denver’s head librarian, says the social workers were necessary to both better serve the homeless population and to help out the library staff.

“Those of us who went to grad school to be librarians didn't go to grad school to be social workers,” she says. “And we're, in fact, kind of bridging that role a little bit in ways that were not necessarily comfortable for us.”

The specialized help from the library’s social workers has been beneficial, but it's difficult to find a balance between being a library for everyone, Jeske says, and helping the homeless. They don't want priorities, like children's learning, to suffer. Hardy's position is seen as a way toward finding balance.

It wasn’t seen that way at first though. When Hardy started, she “certainly heard some staff having concerns that this isn't a social service setting” or worries that more people would be invited in. That pushback has softened, and she’s now seen as a resource.


Elissa Hardy, above, is one of two social workers at the Denver Public Library. She started the position two years ago, to better serve the homeless community at the library. Mary Stansbury, bottom left, is the head of the Library and Information Science Program at the University of Denver. Michelle Jeske, bottom right, is the city librarian for Denver.

Mary Stansbury, the head of the Library and Information Science Program at the University of Denver, says a social worker role is a natural fit for a library setting.

“Public libraries have for decades been essential organizations, not just for homeless people but also as a conduit for connecting the agencies, in whatever community that library might be in, that serve the homeless,” Stansbury says.

As Stansbury sees it, libraries provide a safe place. There are security guards, there are places to sit where you won’t be asked to leave, and you’re off the streets. She admits that universities could better prepare librarians for that environment. She hasn’t found a library science program that has a class just on how to serve the homeless. The topic is explored in an existing DU class, and faculty are considering making it a requirement.

"It's certainly one that helps students dig pretty deeply into understanding, how do I empathize with this other person that may smell bad or, won't look me in the eye?” Stansbury says.

DPL social worker Elissa Hardy gets exactly where Stansbury is coming from.

“People don't go into the field of library science thinking they're going to be working in a homeless shelter essentially,” Hardy says.

In the summer, Hardy says many people without a home travel through Denver. Often the first place they go is the library. It could be to find a book. But maybe it's to ask, where are the food lines? Where can I find a shelter? And she says, the library is here to connect people to the resources they need.

Source: Colorado Public Radio


Thursday, May 18, 2017

Toronto Star: Toronto's radical librarians critique Little Free Library

By David Hains
May 10, 2017

University of Toronto reference specialist Jordan Hale has co-authored a critique of the book exchange system known as Little Free Libraries. The "take-a-book, leave-a-book" structures are largely located in white, affluent neighbourhoods in Toronto, study authors say, not the areas most in need.

Toronto’s radical librarians do not like the Little Free Library organization.

In a study published in the Journal of Radical Librarianship, which is real, Ryerson librarian Jane Schmidt and University of Toronto reference specialist Jordan Hale argue that the neighbourhood mini-libraries don’t live up to their stated goals.

“Who could critique a little birdhouse of books?” Hale rhetorically asked Metro, adding she is strenuously pro-literacy and pro-trading-books-by-the-side-of-the-road. But their paper does just that.

“We posit that in absence of any research or evidence of an issue to be addressed . . . simply encouraging literacy in an already information-rich and privileged environment is hardly a heroic charitable act,” Schmidt and Hale wrote.

“We don’t have any issue with book swaps or exchanges,” Hale explained in an interview, adding she has obtained many excellent books that way. She is not, however, pro-Little Free Library, stating her issue is with the organization, not the idea.

The Wisconsin-based non-profit started in 2009 when Todd Bol erected a charming “take-a-book, leave-a-book” structure on his property. After his successful experiment went viral online, the organization grew. There are now 50,000 registered Little Free Libraries worldwide.

The registration fee for each library ranges from $42 to $89 (U.S.), and the organization sells pre-fabricated units for $179 to $1,254. Participants can also build their own unit and pay just the registration fee.

Hale and Schmidt mapped out the locations of the registered take-a-book, leave-a-book fixtures in Toronto. They found that the libraries were predominantly located in white, affluent neighbourhoods and clustered in locations already well served by the public library system. Despite the organization’s stated goal, they were not located in “book deserts,” the neighbourhoods most in need.

Little Free Library also provides no-cost depots through a donor-driven fund, but Hale claims, “We didn’t see any evidence that the money was going anywhere.”

The non-profit told Metro that it has set up hundreds of units through the donor program, including 40 in the U.S. over the past eight months, and that it plans to add more.

“Through these Little Libraries, millions of books are shared each year,” spokesperson Margret Aldrich wrote in an email.

Hale expressed concern that some jurisdictions turn to Little Free Libraries following cuts to full-scale libraries, but they are not an adequate substitute.

She encouraged people to support their local public library, use the community-led library tool kit and to support literacy initiatives in communities that need them most.

Source: Toronto Star

Monday, May 15, 2017

American Libraries: Top Library Tech Trends

Tech leaders recommend the tools and resources your library can adopt now and in the near future




By Alison Marcotte |  May 1, 2017

From virtual reality to gamification to security techniques, libraries are using the latest technology to engage patrons, increase privacy, and help staffers do their jobs.

American Libraries spoke to library tech leaders—members of the Library and Information Technology Association’s popular Top Tech Trends panel from the 2017 Midwinter Meeting & Exhibits—to get the apps, devices, software, and best practices that you can adopt for your library right now and in the near future.

1. Take patrons on a virtual tour

Create a virtual tour of your library using a 360-degree camera and post it to your website or social media, says Cynthia Hart, emerging technologies librarian at Virginia Beach (Va.) Public Library (VBPL). Virtual tours can be helpful for both information and accessibility.

“One of our branches is 125,000 square feet. The A’s for adult fiction are all the way at the end of the building. Can you imagine if you were a person with disabilities or if you were an older person or had low mobility?” Hart says. “If you didn’t know that when you went into a library, wouldn’t it be helpful to have that virtual tour of the building? Then you could call and say, ‘Hey, can you pull that book from the shelf?’” Virtual visit statistics can also be used as a gate count metric.

2. Make Google Cardboard sets

Augmented reality and virtual reality (VR) have become mainstream, from Pokémon Go to PlayStation VR. VR technology can be used not only for entertainment but also as a way of engaging and teaching students.

Google Cardboard is an inexpensive VR platform that allows you to visit places, play games, watch YouTube videos, or fly through outer space. Google Cardboard and VR apps—such as Proton Pulse, NYT VR, GoPro VR, VR Roller Coaster, and Titans of Space—can be downloaded on a smartphone.

To use the platform, you can buy a Google Cardboard VR viewer, which costs around $20, or you can make one. Meredith Powers, young adult librarian at Brooklyn (N.Y.) Public Library (BPL), showed teens how to build their own Google Cardboard VR cases using plastic lenses purchased online, Velcro, magnets, and cut-up cardboard boxes.

Gena Marker, teacher-librarian at Centennial High School in Meridian, Idaho, says she plans on doing the same.

“I think that not only can we introduce low-cost ways to bring these technologies in, but we can also tie that to the maker movement and teach patrons that you don’t have to spend $500 on an Oculus Rift to have a VR experience,” Marker says. “Come into the library, and we’ll show you how to take a $2 lens that we bought off Amazon and some otherwise junk cardboard to create your own VR experience.”

3. Go on a Google Expedition

Using a Google Cardboard kit and smartphone, students (or “explorers”) can use the Google Expeditions app to take educational VR field trips to Mars, the Guggenheim Museum, the Great Barrier Reef, and other destinations. Teachers (or “guides”) can lead students using a tablet. The tours include annotations, questions, and points of interest.

“Google Expeditions is an easy way to bring that VR experience to a library program,” VBPL’s Hart says. “Libraries can offer headsets as a part of their circulating collection.”

4. Teach with gamification platforms

Just like VR, gamification platforms and apps can also engage students at the library and are freely accessible. Kahoot, Socrative, Quizlet, and Quizalize can be used for a library orientation or class project.

These platforms can be helpful for school librarians. Marker says she does a library orientation for new high school students in the fall.

“I created my orientation questions as a Kahoot! game,” Marker says. “It changes up the format a little bit. So instead of me spending half an hour standing in front of a group of freshmen saying, ‘Do this, don’t do that. Here’s where things are in the library,’ it gives it a fun twist.”

Marker says anyone can create an account and make a game that lives on the Kahoot! website. Teachers can then use that game as a quiz, review, or pretest in class.

5. Get coding with Code School

Bill Jones, Information Delivery Services (IDS) Project creative technologist at the State University of New York at Geneseo’s Milne Library, says online coding instruction, with its ease of access and low barrier to entry, is a great trend for libraries to get involved with. Online services can also help defray the costs of tuition and textbooks.

“It’s not equal to the classroom experience,” he notes, “but it does work in terms of teaching people.”

Jones recommends Code School, which offers some free courses, as well as membership-only resources, such as interactive courses, screencasts, coding challenges, and an online community. He says he prefers it to other platforms because it modularizes the learning process.

“They really gamify the way that the whole learning process goes so that you can see how far you need to go to complete this session but also how far you’ve gone,” Jones says. “You watch a short, five-minute video, and then you do the lesson. The hands-on tasks are right there in your browser, so it’s super easy to get right into it and start coding without having to set up any servers or even build a local hosting solution on your local machine.”

6. Make circuits with tech-loving students

MIT Media Lab has created a new user interface called DuoSkin. These temporary tattoos, made with gold and metal leaf to make a circuit, let you use your skin like a trackpad to control what is displayed on your mobile device. DuoSkin is not available yet for consumers, but Powers says librarians could explore low-budget ways to get teens involved with circuit technology. BPL has held programs during Teen Tech Week where teens create paper circuits.

“You can get some copper wires, a coin cell battery, and some LEDs, and you can make greeting cards that light up,” she says. “It’s a little lower-tech, lower-budget version of a cool tattoo, but it’s definitely something you can get for 20 bucks’ worth of supplies for 20 kids, so that’s important.”

7. Teach patrons about private browsing

Powers says that while libraries don’t keep logs of public computer sessions, patrons and library staff can further protect themselves while using public Wi-Fi with a virtual private network (VPN), which encrypts the data being sent and received so that it is shielded from others, including people on the same network or the ISP.

A trustworthy VPN protects your anonymity, doesn’t keep logs, and doesn’t discriminate against traffic or protocol types. Powers says patrons will want to research the following to decide which VPN service is best for them: whether the VPN uses current security protocols; whether the terms of service and privacy policies are clear and understandable; what the VPN covers and what it doesn’t; which countries its servers are located in; whether it runs its own servers; and, especially, how much data logging or tracking the VPN provider engages in.

“It’s also helpful to know what the company or people are like—their reputation, expertise, history, etc.—and if there’s a history of any company or founder activism that might demonstrate their commitment to consumer privacy,” Powers says.

Resources that can help you evaluate a VPN provider include EFF’s Surveillance Self-Defense and the VPN section of That One Privacy Site.

Another way to ensure private browsing is to use Tor software, which protects users by bouncing their communications around a distributed network of relays run by volunteers. It prevents somebody watching a user’s internet connection from learning what sites that user visits, prevents the sites themselves from learning the user’s physical location, and lets users access sites that are blocked. Tor is not always available on library public computers, however.

8. Create strong passwords with a roll of the dice

Powers teaches workshops on digital literacy and data privacy as part of the Data Privacy Project, which is funded by the Institute of Museum and Library Services and the Knight Foundation’s Prototype Fund. Password security is always a hot topic.

She says that Diceware is an easy way to teach patrons how to create better passwords for their library, service, and email accounts. By rolling an ordinary die five times (or five dice at once), users create a five-digit number that corresponds to a word in a Diceware word list. The Electronic Frontier Foundation (EFF) maintains multiple Diceware word lists.

“Even though the list of words is publicly available, the security of a Diceware password comes from the number of words selected and the number of available words on a Diceware list,” Powers says. “By rolling dice to create several words in a sequence, you can create a strong, memorable passphrase. The creator of Diceware, Arnold Reinhold, currently recommends a six-word sequence to protect against a brute-force hack attempt.”
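For librarians who want to demonstrate this in a workshop, here is a minimal sketch of Diceware-style passphrase generation in Python. It assumes a word list in the common Diceware/EFF layout (a five-digit dice roll, whitespace, then a word on each line); the file name is illustrative, and this is not the official Diceware tool.

```python
# Sketch: generate a Diceware-style passphrase from a local word list.
# Assumes each line of the list looks like "11116 aback" (five dice digits, then a word).
import secrets

def load_wordlist(path):
    words = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2 and len(parts[0]) == 5:
                words[parts[0]] = parts[1]
    return words

def roll_word(words):
    # Five rolls of an ordinary six-sided die form the five-digit lookup key.
    key = "".join(str(secrets.randbelow(6) + 1) for _ in range(5))
    return words[key]

def passphrase(words, n=6):
    # Reinhold currently recommends a six-word sequence.
    return " ".join(roll_word(words) for _ in range(n))

print(passphrase(load_wordlist("diceware_wordlist.txt")))  # hypothetical local file name
```

As Powers notes, the security comes from the randomness of the rolls and the size of the list, not from keeping the list secret, which is why the sketch uses the cryptographically strong `secrets` module rather than `random`.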

9. Streamline with data-driven development

Jones says he thinks data-driven development is a trend to watch.

“An example of [data-driven development] would be the article gateway that we’ve produced out of IDS Project,” he says. “This was looking at the worst areas in workflow and finding ways to improve.” Those ways include shaving time off processing transactions and saving money while making sure that data gets to users as fast as possible.

IDS Project has a wide range of members, from small community colleges to Research I institutions and some public libraries like New York State Library.

“You’ll see many different consortia building their own type of data analysis tools that they can use to strengthen their communities and their groups of libraries to benefit everybody, whether that’s decreasing time in shipping or cutting pieces out of workflow that can be automated or streamlined,” Jones says. “I really think it takes a group to look at data across many different libraries and library types.”

He adds that he thinks more tools will be coming out, especially since more service providers are offering APIs out of which data can be pulled and used to make intelligent decisions on transactions.

“I just think that communities are going to start to grow together more to find solutions on their own because they’re not going to be able to afford the solutions that are coming out of these big companies. People just can’t afford it,” Jones concludes. “So we’re going to be relying on each other as a library community to find these solutions.”

10. Develop your own applications

Libraries can also develop or improve applications themselves. For example, University of Michigan Library redesigned its link resolver interface in late 2016.

The library had been using a link resolver interface that Senior Program Manager Ken Varnum says was confusing and out of date, did not meet accessibility requirements, looked like an error page, and did not provide detailed analytic data on user behavior.

His team decided to replace that application with a custom solution created using Umlaut, an open source option. They created a design that makes the page’s purpose clear with an accessible interface, improved usability, and better analytics.

“Now all those open URL link transactions happen on a library server where we can provide the services that we want to our user base directly, and we can make it work just the way we want to,” he says.
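To make the job of a link resolver concrete, here is a minimal sketch of its first step: parsing an OpenURL-style request into citation fields before deciding which services to offer. The sample URL, field handling, and routing logic are illustrative assumptions, not the University of Michigan's or Umlaut's actual implementation.

```python
# Sketch: parse the referent (rft.*) fields out of an OpenURL-style request.
# The resolver URL and citation below are hypothetical examples.
from urllib.parse import urlparse, parse_qs

SAMPLE = ("https://resolver.example.edu/openurl?"
          "rft.genre=article&rft.jtitle=Journal+of+Documentation"
          "&rft.atitle=An+Example+Article&rft.issn=0022-0418&rft.date=2016")

def parse_openurl(url):
    query = parse_qs(urlparse(url).query)
    # Keep only referent fields and flatten single-valued lists.
    return {k.split(".", 1)[1]: v[0] for k, v in query.items() if k.startswith("rft.")}

citation = parse_openurl(SAMPLE)
if citation.get("genre") == "article":
    # A real resolver would now check holdings and build full-text or ILL links.
    print("Find full text:", citation.get("atitle"), "in", citation.get("jtitle"))
```

Running these transactions on a library-controlled server, as Varnum describes, is what lets the library collect analytics and tailor which services appear on the page.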

If libraries want to develop their own applications, Varnum says that looking for open source software that does something similar to what you want to do is key. Even if you can’t find software that does exactly what you want, you may be able to find something close.

You can then either rethink your end goal—is the existing software close enough to your intended goal that you can live with the differences?—or modify the software to meet your specific need.

“I’ve noticed that, many times, libraries let the great get in the way of the good and ignore the 80–20 rule,” Varnum says. “That is, they can get 80% of the way toward their goal with something easy to implement, but they let the 20% customization of the interface be a blocker.”

If you do customize the software or write your own, Varnum recommends sharing your innovations and changes with the open source community to make it available to others.

Source: American Libraries

Sunday, May 14, 2017

Educause Review: Transforming Our Libraries from Analog to Digital: A 2020 Vision

By Brewster Kahle
March 13, 2017

By 2020, we can build a collaborative digital library collection and circulation system in which thousands of libraries unlock their analog collections for a new generation of learners, enabling free, long-term, public access to knowledge.




Today, people get their information online — often filtered through for-profit platforms. If a book isn’t online, it’s as if it doesn’t exist. Yet much of modern knowledge still exists only on the printed page, stored in libraries. Libraries haven’t met this digital demand, stymied by costs, e-book restrictions, policy risks, and missing infrastructure. We now have the technology and legal frameworks to transform our library system by 2020. The Internet Archive, working with library partners, proposes bringing millions of books online, through purchase or digitization, starting with the books most widely held and used in libraries and classrooms. Our vision includes at-scale circulation of these e-books, enabling libraries owning the physical works to substitute them with lendable digital copies. By 2020, we can build a collaborative digital library collection and circulation system in which thousands of libraries unlock their analog collections for a new generation of learners, enabling free, long-term, public access to knowledge.

The Problem

We all want to see the modern-day Library of Alexandria, a digital library where the published works of humankind — all the books, music, video, webpages, and software — are available to anyone curious enough to want to access them. I believe now is the time to build it.

The technology and costs to achieve this vision are now understood, and in fact, various projects are proving that it can be done. Three major entities have digitized modern materials at scale: Google, Amazon, and the Internet Archive, probably in that order of magnitude. Google’s goal was to digitize texts to aid user search and its own artificial intelligence projects. Amazon’s book-digitization program helps customers browse books before purchasing them; Amazon is quiet about the number of books it has scanned and any future plans for them. The Internet Archive has digitized more than 2.5 million public domain (pre-1923) books, which are fully downloadable, and more than 500,000 modern (post-1923) books, which are available to the blind and dyslexic and through the lending system on its Open Library site.

Yet bringing universal access to all books has not been achieved. Why? There are the commonly understood challenges: money, technology, and legal clarity. Our community has been fractured by disagreement about the path forward, with ongoing resistance to some approaches that strike many as monopolistic. Indeed, the library community seems to be holding out for a healthy system that engages authors, publishers, libraries, and most importantly, the readers and future readers.

I suggest that by working together, we can efficiently achieve our goal. This will require the library community working with philanthropists, booksellers, and publishers to unleash the full value of our existing and future collections by offering them digitally.

For the books we cannot buy in electronic form, I am proposing a collaborative effort to select and digitize the most widely held and used books of the 20th and 21st centuries, and to build a robust system to circulate the resulting e-books to millions and eventually billions of people.

Mike Lesk, considered by many to be the father of digital libraries, once said that he was worried about the books of the 20th century and noted that we haven’t figured out “institutional responsibility” in our digital world.1 He believed that the materials up to the 19th century would be digitized and available and that the 21st-century materials, since they were born-digital, were going to be circulated effectively. But the 20th-century materials, he thought, would be caught in machinations of copyright law — most remaining out-of-print, and all seemingly locked up by late-20th-century laws that appeared to make digitization risky.

As we shift from the analog to the digital era, Lesk’s comment about “institutional responsibility” is also apt. Today, public, university, and national library leaders are not clear how best to perform their preservation and access roles, at a time when subscribing to remote databases is increasingly common and when publishers are trying to adapt to a world in which distribution is increasingly consolidated among a few powerhouses. If we are to have healthy publishing and library ecosystems, we need many winners and not just a few dominant players. But how do we achieve that?

A step forward would be for libraries to buy e-books when they can, but also to transform efficiently the books currently on our physical shelves to sit on our digital shelves as well. Patrons could then easily borrow either the physical books or the electronic versions.

Open Library: Building on a Six-Year Pilot

Since 2010, the Internet Archive’s Open Library has been piloting collaborative collection and lending of 20th-century books contributed by dozens of libraries (see figure 1).2 For six years, we have been buying e-books or digitizing physical books to lend. We now lend more than 500,000 post-1923 digital volumes to one reader at a time via the Open Library website. This digital circulation mechanism employs the same protection technologies that publishers use for their in-print e-books distributed by commercial operations such as OverDrive and Google Books. Watching Open Library being used by millions over the years, we have found this approach to work. The time is ripe to go much further!

Figure 1. The Internet Archive’s Open Library
Using the Open Library approach as a foundation, we can expand to bring all interested libraries digital by 2020. By building upon the collection of 2.5 million public domain e-books that so many libraries have collaboratively digitized with the Internet Archive, we can bring the full breadth of books, both past and present, to millions of readers on portable devices, at websites, and through online library catalogs. With its extensive collections and strong public service mission, the library community can be central to this endeavor.

For instance, in each library’s online card catalog, when a digital version of a book exists, we can include a web link on the record for the physical book, giving readers the ability to browse the book on screen or to borrow it from the convenience of their homes. In this way, we can smoothly enhance a library’s collection, from analog to digital, at scale, by coordinating with the cloud-based vendors that host library catalogs. We would also collectively work with publishers to purchase as many books as possible for library lending.

To build this future, we will need the participation of multiple sectors to bring thousands of libraries digital. That is one of the essential differences from the 2004 Google Book Search project, an attempt by Google and several large research libraries to bring 20th-century books online in a centralized way. That path yielded, in 2008, the Google Books settlement proposing a central controlling authority, which the courts halted in 2011 as monopolistic.3

A System with Many Winners

I believe this time we can pursue a decentralized approach, one that leads to many publishers and many libraries interacting through the market rather than having a single controlling entity. While libraries today often license e-books with restrictive terms, libraries are better served if they purchase e-books with the same rights to lend and preserve that they are entitled to when they purchase physical books today. Hopefully, going forward, all books would be available to libraries in this way — providing revenue to ensure healthy author and publisher sectors that would garner their support. But what about books that are not available in this form — including most of the existing library collections and some books published today? For these texts, libraries can work together to digitize the materials efficiently while minimizing duplication and can lend the digital texts with the same limitations placed on physical books.

In this way, patrons could read past and present books on the screens of their choice; librarians would perform their traditional roles of purchasing, organizing, presenting, and preserving the great works of humankind; publishers would sell e-books at market-based rates; and authors could choose how to distribute their works, including through publishers for payment. This may sound old-fashioned and not particularly “disruptive,” but it bears the advantage that each institution plays a role structurally similar to the role it has played historically.

Different Eras of Books: Different Solutions

To bring our libraries digital, let's first discuss ways that groups are digitizing books at scale and then address how they can be made maximally available. The historical core of a great library, often pre-1923 books, resides in the public domain and thus does not have rights issues to hamper distribution. Libraries with their rich special collections must still catalog and digitize their books, and we continue to work with hundreds of libraries to bring their special collections digital. But the large swath of public domain works has largely been digitized twice in the last ten years: once by the libraries working with Google and once by the libraries collaborating with the Internet Archive. Google’s project has been much more thorough in its scope, scanning an estimated 25 million books thus far, but unfortunately, access to these works is limited. Institutional subscribers can gain limited access to the Google books through HathiTrust, and the public can download some public domain books, one at a time, through the Google Books website. The Internet Archive’s digitized 2.5 million older books, on the other hand, are available in bulk and for free public access. Indeed, content specialists from genealogy to biodiversity researchers actively download public domain materials from the Internet Archive, fueling innovation, dissemination, and broad public good. While we still need to complete the digitization of special collections and government documents, the pre-1923 corpus of published books is largely online and available, albeit often with restrictions.

The 20th-century books, the era that worried Lesk, are also the books librarians fret about due to rights issues. In most of the developed world, an organization can digitize books for the blind and dyslexic, and through the Marrakesh Treaty (2013), signatory countries can share these books with other signatories at scale in a way that is explicitly legal.4 In practice, this means Canada can now digitize and lend a book from any era for the reading disabled and can share those digital copies with libraries in Australia or more than two dozen other countries. Furthermore, the U.S. court’s ruling in Authors Guild v. Google found the basic act of mass digitization of books, even by commercial entities, to be legal under the “fair use” doctrine in the United States. So the right to digitize has been settled in many countries. A remaining legal question is what access is allowed; this proposal will allow different libraries to make their own decisions.

I believe that building a major library at the scale of the Princeton University Library, the Yale University Library, or the Boston Public Library would require institutions to offer access to a curated digital collection of 10 million books, most of which are post-1923. Collaborators can prioritize subsets of books, such as the 1.2 million books most widely held by libraries according to OCLC or the almost 1 million books that appear on one or more syllabi as determined by the Open Syllabus Project.5 A team of collaborators could volunteer to ensure full coverage in the major subject areas while building on the core collection. But for the purposes of argument, let’s stipulate that 10 million books is the number we would need to support a broadly useful public digital library system.

Collaborating to Build a Digital Collection

Building a collaborative digital collection of 10 million books will require our libraries and our partners to efficiently perform three functions:

  • Coordinate collection development to avoid duplicating effort
  • Offer local and cloud access
  • Provide distributed preservation

In very broad strokes, to build the collections, we need curators or curatorial approaches for selecting the most useful books, then a process to determine which books we already have digitized. We need institutions or vendors able to source the missing physical books to be digitized. The participating organizations would need to have the funding to staff these functions, based either on their internal budgets or on funds raised from philanthropic sources. Maybe we could start with some already funded projects, since they might help shape the rest of the system.

Curating a Collaborative Collection

Prioritizing the books is still an open question. One approach might be to break the collection into a widely-used core of books for K-16 learners and into important topical collections. The Internet Archive could focus on obtaining and scanning the core collection of perhaps 1–2 million volumes, and then partner libraries with strong specialties could develop and scan the subject-based collections. An engineering school might take on engineering books, and a law school could focus on law books.

We must continue to work with Google Books, HathiTrust, and Amazon to explore areas of alignment. No one in the library world wants to waste precious resources by digitizing a text more than once. It would be a public benefit if these large-scale digitizers would be willing to contribute to this collaborative effort.

We will also need to research which books are emerging from copyright protection and create a comprehensive list of all digitized works. These will be important areas of research to support.

Various Levels of Access

Once we have established the core collections, each library can determine its own approach to providing access to modern works. Some might want to start by giving full access to the blind and dyslexic, as the University of Toronto is doing through the Ontario Council of University Libraries (OCUL) and the Accessible Content E-Portal. Others, such as the University of California, might want to create a preservation copy. Some, such as HathiTrust, might prepare datasets for nonconsumptive researcher access. And many others, including the Internet Archive, may choose to lend their copies while keeping the physical copy on the shelf. This flexibility in access models could be one of the great strengths of this overall approach to bringing 20th-century books online — different libraries in different countries can play varying roles as their environment permits.

Libraries can take a giant step forward in the digital era by lending purchased and digitized e-books. The Internet Archive digital e-book lending program mirrors traditional library practices: one reader at a time can borrow a book, and others must wait for that one to be returned manually; alternatively, after two weeks the book is automatically returned and is offered to any waiting patrons. The technical protection mechanisms used to ensure access to only one reader at a time are the same technologies used by publishers to protect their in-print e-books. In this way, the Open Library site is respectful of rights issues and can leverage some of the learning and tools used by the publishers. The California library consortium Califa has set up its own lending server, and it makes purchased and digitized books available through its own infrastructure to California residents. We understand the Department of Education in China also loans books it owns to one reader at a time at a major Chinese university. We all learn and benefit when different organizations in different countries test a range of approaches to access, balancing convenience and rights issues.
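To make the circulation rule concrete, here is a minimal sketch of one-copy, one-reader-at-a-time lending with a hold queue and an automatic two-week return. The class and method names are hypothetical; this is not the Open Library's or Califa's actual code, only an illustration of the policy described above.

```python
# Sketch: one digital copy, one reader at a time, with a hold queue and
# an automatic return after a two-week loan. Names are illustrative.
from collections import deque
from datetime import datetime, timedelta

LOAN_PERIOD = timedelta(weeks=2)

class DigitalCopy:
    def __init__(self, title):
        self.title = title
        self.borrower = None
        self.due = None
        self.holds = deque()          # patrons waiting, first come first served

    def borrow(self, patron, now=None):
        now = now or datetime.utcnow()
        self._auto_return(now)        # enforce the two-week limit lazily
        if self.borrower is None:
            self.borrower = patron
            self.due = now + LOAN_PERIOD
            return True               # loan granted
        self.holds.append(patron)     # copy is out; join the wait list
        return False

    def return_copy(self, now=None):
        now = now or datetime.utcnow()
        self.borrower, self.due = None, None
        if self.holds:                # offer the copy to the next waiting patron
            self.borrow(self.holds.popleft(), now)

    def _auto_return(self, now):
        if self.borrower is not None and now >= self.due:
            self.return_copy(now)
```

A manual return simply calls return_copy early; either way the next patron in the queue receives the copy, mirroring how a single physical copy circulates.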

How would we circulate the digital e-books? Some libraries are integrating links into their library catalogs, so information about the digital version and the physical copy sits side by side in the same record. Libraries can always link to the copy in the Internet Archive’s Open Library, but if this is a modern book, there may be only one copy available for the whole world. Libraries can also store their own digital copies and administer their own lending systems, as Califa has done. Another alternative is that the Internet Archive could create a circulation system that would administer the lending for libraries. In effect, then, each library can choose from a variety of methods to lend digital versions of the physical books in its collection. This would keep the local libraries in control but leverage the convenience of a cloud-based system that others maintain and update.

Turning on the e-book links in a catalog might be very easy now that many libraries have their catalogs on cloud services from major catalog vendors. Persuading those providers to collaborate with this community could help deliver e-books to millions of patrons with a flip of a digital switch.

Distributed Preservation

If we are striving to build the modern-day Library of Alexandria, we should avoid the fate of the first Library of Alexandria: burning. If the library had made another copy of each work and put them in India or China, we would have the complete works of Aristotle and the lost plays of Euripides. Our community should preserve multiple copies of the books that are bought and digitized. While many libraries may be content with access to the collection on a cloud-based server, we can empower and encourage a number of libraries to store local digital copies of their books.

Fortunately, digitized books are compact enough to be affordable for libraries to store. Digital books, even with high-resolution images and all the derivative formats, are often 500 megabytes in size, so 1 million books would be 500 terabytes, which is increasingly affordable.

Distributed preservation of both the purchased e-books and the digitized books can help ensure the longevity of the precious materials in our libraries.

The Internet Archive’s Funding and Technology

The Internet Archive has secured new funding to develop “super scanning centers” for the mass digitization of millions of books per year, at a significant cost savings. With the first funded super scanning center in Asia that we are now certifying for production, we anticipate being able to scan books for about one-third of the normal in-library rates achieved by the Internet Archive’s twenty-eight Regional Scanning Centers. Through the Asian super scanning center, the Internet Archive can offer partners a cost savings of 50–60 percent for those willing to scan large quantities of books and have them out of circulation for several months. We are now talking with a large university research library about a plan to digitize 500,000 modern books using an Internet Archive super scanning center. This project offers the library new options in collection management, allowing it to provide digital access to books that are moving to an offsite repository. Librarians may find mass digitization at reduced cost to be a powerful tool for collection management.

In the past year, the Internet Archive has developed an in-library book-scanning system that integrates duplication detection, catalog lookup, digitization, and integrated delivery. This can be useful for organizations that want to move through their collections, discover what has not been digitized either by themselves or by others, and digitize just these texts — while gaining access to the Internet Archive’s digitized versions of all of their books, digitized from a large variety of source libraries.

Also, we now have a funding commitment to digitize millions of books and other materials that are donated to the Internet Archive. Through this initiative, the Internet Archive will seek to acquire and then digitize a core collection of books based on the recommendations of a curatorial team, while considering lists such as those compiled by OCLC and the Open Syllabus Project. This funding gives other organizations the option to donate appropriate physical books to the Internet Archive and receive a digital copy in return, at no cost to their institution.

In these ways, libraries can choose the most appropriate means of scanning their holdings. We now offer options ranging from the Table Top Scribe (see figure 2), where institutions purchase the hardware and supply their own staffing, to our regional centers in institutions such as the Boston Public Library, the University of Toronto, the Princeton Theological Seminary, and the Library of Congress. We offer lower costs for mass digitization at our Asian super scanning center and free digitization for appropriate materials donated to the Internet Archive. Our goal in offering this plethora of scanning options is to encourage all libraries to participate in the collaborative collection building in a paradigm that works for them.

Figure 2. The Internet Archive’s Table Top Scribe, a Portable, Low-Cost Scanner
Costs of Digitization

At the Internet Archive, the cost of digitization varies between $10 and $30 per book, depending on where the scanning occurs — offshore or in a library. Additional costs include acquisition, storage, and lifetime digital file management, which may come to be the predominant cost in the future.

Current in-print books are often available in e-book form, but there are few publishers willing to allow libraries to buy e-books with similar rights to the physical books they purchase. There is hope that if we coordinate our buying power, the book publishers will embrace selling e-books to libraries, much as the music publishers have come to embrace, or were forced to embrace, the selling of MP3s to services that provide broad access.6 When available, the purchase price for these e-books tends to be approximately the same as the cost of the physical book.

Financial Stability

So far there has been little discussion of money changing hands or of any financial model to support maintaining and growing this system. If the libraries share the burden of the digitization and share the results, there would then be an incentive for some to “freeload” and wait until other libraries digitize the books and provide the services. If we want to counter this, those libraries that did not contribute digitization or backend services could be charged for access to digitized books. And we could charge a one-time transfer fee to libraries that want to store their own local copies. But we should think carefully about financial models and avoid incentives leading to dominant systems that will limit innovation.

Conclusion

Each of our organizations has a role to play in building this collaborative digital library collection and circulation system. The Internet Archive is ready to contribute scanning technology, backend infrastructure, and philanthropic funding to digitize a core set of books that will serve K-16 learners. We are calling for partners who will help curate and source the best collections beyond what we can do, for vendors who will help circulate digital copies, and for leaders who are bold enough to push into new territory.

Because today’s learners seek knowledge online, we must enable all library patrons to borrow e-books via their portable devices, by searching the web or by browsing online library catalogs. By working together, thousands of libraries can unlock analog collections for a new generation of learners, enabling digital access to millions of books now beyond their reach. The central goal — for future learners to have access to all books without physical constraints — could be realized for millions of people worldwide by the year 2020.
Note: An earlier version of this article was published as the white paper “Transforming Our Libraries into Digital Libraries: A Digital Book for Every Physical Book in Our Libraries,” Library Leaders Forum, Internet Archive, San Francisco, October 2016.
Notes
  1. Mike Lesk, personal conversation with the author.
  2. Geoffrey A. Fowler, “Libraries Have a Novel Idea,” Wall Street Journal, June 29, 2010.
  3. James Grimmelmann, “The Orphan Wars,” EDUCAUSE Review 47, no. 1 (January/February 2012).
  4. “Marrakesh Treaty to Facilitate Access to Published Works,” World Intellectual Property Organization (WIPO), accessed February 4, 2017.
  5. “OCLC Provides Downloadable Linked Data File for the 1 Million Most Widely Held Works in WorldCat,” OCLC, news release, August 14, 2012; Open Syllabus lists 933,635 texts as of February 2017: http://explorer.opensyllabusproject.org/.
  6. Steve Jobs, “Thoughts on Music,” Apple (via Internet Archive Wayback Machine), February 6, 2007.
Source: Educause Review

Thursday, May 4, 2017

Ars Technica: As US prepares to gut net neutrality rules, Canada strengthens them

Canada cracks down on zero-rating while FCC allows paid data cap exemptions.


April 21, 2017
By Jon Brodkin

Canada is taking a much stronger stand against data cap exemptions than the United States.

In the US, the Federal Communications Commission's new Republican leadership signaled that it won't enforce net neutrality rules against zero-rating, the practice of favoring certain Internet content by exempting it from customers' data caps. The FCC made that clear when it rescinded a determination that AT&T and Verizon Wireless violated net neutrality rules by letting their own video services stream without counting against customers' data caps while charging other video providers for the same data cap exemptions.

Canada is also taking a case-by-case approach to zero-rating instead of banning it outright. But yesterday, the Canadian Radio-television and Telecommunications Commission (CRTC) ordered changes to one carrier's zero-rating program and announced that it will enforce stricter guidelines for determining whether zero-rating programs are discriminatory.

Zero-rating "generally gives an unfair advantage or disadvantage to certain content providers and consumers," CRTC said in an announcement. The group said that it is "strengthen[ing] its commitment to net neutrality," and it also published detailed guidelines and its decision against Videotron, a telecom whose "Unlimited Music" program exempts certain online music providers from data caps of subscribers with certain mobile data plans.

Zero-rating music or video not allowed


The new policy "supports the freedom of consumers and citizens to access the online content of their choice without being unduly influenced by the marketing strategies and pricing decisions of ISPs with respect to the transmission of specific content," the CRTC said. "It also supports the ability of all content providers to innovate and encourages ISPs to compete and innovate based on the capabilities of their networks, as well as to offer a range of speed- and volume-based data packages to provide better choices to Canadian consumers."

The CRTC says that zero-rating should be open to all types of online services. Thus, zero-rating programs that exempt a broad category of content—such as video—would likely violate the CRTC policy even if the programs are open to all video providers and even if the video providers don't have to pay the ISP.

Canada's stance against zero-rating (which the CRTC calls "differential pricing practices") appears to be even stricter than the FCC's was under former Chairman Tom Wheeler. Wheeler, a Democrat, determined that paid data cap exemptions as implemented by AT&T and Verizon were discriminatory. But he gave the green light to T-Mobile's zero-rating programs that exempted a wide range of video and music services from data caps without requiring payment.

Ajit Pai, the new Republican chair of the FCC, argues that free data is good for consumers even when carriers are exempting their own online services while charging competitors for the same data cap exemptions. Pai is also reportedly developing a plan to eliminate the FCC's net neutrality rules and replace them with "voluntary" commitments that would be enforced by the Federal Trade Commission.

Canada, on the other hand, found that Videotron's program was discriminatory even though the carrier said it wasn't charging music providers for the data cap exemptions. (Videotron did require music streamers to meet certain technical requirements.) Zero-rating based on "content/application categories raise[s] significant concerns regarding the selection, definition, and implementation of the categories, in addition to... likely negative impacts on competition, consumer choice, and innovation," the CRTC said. "The Commission therefore considers that content categories, even broad, apparently all-encompassing ones, would not mitigate the negative impacts of content-based differential pricing practices."

Videotron was ordered to change its Unlimited Music program by July 19 to bring it into compliance. The company had argued that Unlimited Music is "a democratic program that allows participation by any music streaming service provider that meets the program’s technical criteria," and that "wireless service providers like itself must adopt new strategies to improve and differentiate their services in order to attract new customers," according to the CRTC decision.

"Proponents of differential pricing—which include Bell, Telus, Videotron and Facebook, which relies on zero-rating to offer its social network for free around the world—argued that the practice was good for innovation and would mean more choice and lower costs for consumers," the CBC wrote.


Complaints-based system


The Canadian regulator said that ISPs don't need to seek permission from the government before implementing zero-rating programs. But if an ISP is unsure about whether a program violates the rules, it can ask the CRTC for a decision on whether the program would be allowed before launching it. After programs are launched, the CRTC will use a complaints-based system to determine whether a zero-rating program is discriminatory.

The CRTC will judge programs based on four criteria: "the degree to which the treatment of data is agnostic (i.e., data is treated equally regardless of its source or nature); whether the offering is exclusive to certain customers or certain content providers; the impact on Internet openness and innovation; and whether there is financial compensation involved."

Of those, "the degree to which the treatment of data is agnostic will generally carry the most weight," the CRTC said. "In any evaluation, the Commission will also consider whether there are any exceptional circumstances that demonstrate clear benefits to the public interest and/or minimal harm associated with a differential pricing practice."

The CRTC said it can impose monetary penalties for violations, and that it will try to address complaints quickly in order to "minimize the risk of regulatory gaming."

ISPs will be allowed to zero-rate administrative functions that let their subscribers monitor data usage and pay bills online. The CRTC rejected calls to allow zero-rating for short-term marketing programs, such as trial periods for video games, saying that such an exemption "would lead to a risk of regulatory gaming and would not mitigate the negative impacts of such practices." The CRTC also rejected the idea of allowing zero-rating for social needs because "defining a content category is problematic; it is all the more so if the category is meant to define something as broad and subjective as 'social good.'"

OpenMedia, an advocacy group that submitted comments in the CRTC proceeding, praised Canada's decision, saying that "telecom companies use zero-rating schemes to artificially pick winners and losers online, and to deflect pressure from customers for larger and more affordable data caps, or an end to data caps altogether."

OpenMedia is satisfied even though the CRTC did not ban zero-rating entirely. "The onus is still on consumers and advocacy organizations like OpenMedia to complain to the regulator if any service offered by a provider violates the new framework, but the new rules cover 99 percent of the problematic cases we've seen emerge in markets like the US," OpenMedia Campaigns Director Josh Tabish told Ars. So far, the CRTC "has shut down every instance of an anti-competitive zero rating scheme" in Canada, and the new framework should speed up the complaints process, he said.

Canada is trying to encourage carriers to offer unlimited data plans, particularly on home Internet service. The CRTC recently declared that all Canadians should be able to purchase home Internet with 50Mbps download speeds and 10Mbps uploads, and it created a $750 million fund for areas where that level of Internet service isn't available. The money can also be used to boost mobile networks but without any requirement for unlimited mobile data.

Source: Ars Technica