Tuesday, November 13, 2012

The cloud is the library, according to a Dutch think tank

Every four years, the scientific technical council of the Dutch SURF foundation publishes a trend report. The aim of this report is to chart the main trends in IT that will affect institutions of higher education in the near future. The main themes of this year’s report are the cloud and the trend that users increasingly look for, and create, their own solutions with what’s available on the web itself.

The report is, unfortunately, only available in Dutch at this page.

I was particularly interested in the chapter on academic libraries written by John Mackenzie Owen and Leo Plugge.

Their basic premise is that scholarly information has increasingly moved to the cloud and that the Open Access share of that scholarly information is increasing. Combined, these two trends have profound consequences for academic libraries.

At the same time, this new world in the cloud poses some interesting challenges for students and scholars. Search skills are becoming more and more important. There is not yet a clear solution for OA author fees. New forms of publishing are emerging, so-called enhanced publications, that call for new skills, among them legal skills when it comes to the reuse of objects. And the uncontrolled proliferation of available data sets is, at the moment, simply chaotic.

Collections, once at the core of the academic library, have moved to the cloud and, with OA on the rise, are increasingly openly available. Neither the library nor the library web site is any longer the first stop in students’ and scholars’ quests for information. Libraries will retain a role in curating historical print material, in supporting subject fields that are digitizing only slowly, like the humanities (although I think digital humanities initiatives are building up critical mass very fast now), and, thirdly, as a ‘living room’ for undergraduate students.

Examples of users taking the lead in finding their own solutions are arXiv, the Web, Google, PLoS, and Mendeley. And it is true that all these initiatives started in research institutions and universities. A far more important trend, I think, is that students and scholars are no longer dependent on their institution when it comes to creating and publishing content collaboratively. Hardware is becoming cheaper and cheaper, internet access is ubiquitous, and the web is rife with free services that can act as alternatives to solutions traditionally offered by the IT department, such as a learning management system.

An interesting observation is that Dutch universities spend proportionally more on their libraries (more than 3 percent of their budget) than US universities (less than 2 percent). I am not sure whether such a sweeping statement holds up, but the implication seems to be that there’s an opportunity for Dutch universities to save money. Money that could be used, for instance, to support scholars in OA publishing.

The chapter concludes by stating that the cloud is the new library. It also recommends that SURF organize a vision group, in which (advanced) users are strongly represented, to see how a Dutch node in this cloud could be built. The cloud, though, I’d guess, doesn’t stop at borders.

Tuesday, October 23, 2012

ARL white paper on research libraries and MOOCs

ARL has released a 15-page white paper on “Massive Open Online Courses: Legal and Policy Issues for Research Libraries”.

The executive summary is rather concise:

“Massive Open Online Courses (MOOCs) raise significant legal and policy questions for research libraries, which are often asked to support the development of MOOC courses. These questions involve information policy concerns that are central to research libraries, including the proper application of fair use, the transition to open access as the default mode of scholarly publishing, and the provision of equal access to learning materials for students with and without disabilities. Where possible, research libraries should engage in conversations around MOOCs and promote their core values. By doing so, they will also promote the continuing vitality of libraries as partners in the educational mission.”
Hopefully, all these legal issues will not put libraries in the role of road block in the development of MOOCs.

Also, I think it would be interesting to consider how MOOCs could play a role in further education for library staff. Especially so-called cMOOCs seem highly suited to developing staff skills in fast-changing environments. But then, the ARL white paper only seems to know about xMOOCs.

Wednesday, July 25, 2012

Yesterday a badge, today a certificate

The certificate is for finishing the Google Power Search course. Actually I finished it last week, but assessments could be taken until a few days ago. It was Google’s first attempt at a MOOC and they did quite a nice job. You can still view all the material, browse the forums, and watch some recorded Google+ hangouts with search experts, but it’s no longer possible to take the assessments and thereby earn the certificate. I suggested in the final evaluation that Google consider making the course available anytime, anywhere.

The certificate looks nice:


Tuesday, July 24, 2012

Got myself a badge!



And it really was just for watching an extended commercial for Google Apps for Business - and, of course, answering some questions about that commercial.

Nevertheless, ALISON has an interesting approach when it comes to e-learning.


Thursday, May 31, 2012

Elsevier and Dutch Royal Library testing private access to articles

Two days ago, May 29, the Dutch Royal Library published a press release (in Dutch only) announcing an experiment with Elsevier to give private individuals and independent researchers ‘cheap’ access to about 2.5 million articles in about 400 Elsevier journals, mainly medical ones. Cheap means a price of 6.50 euros per article. The experiment is scheduled to run until the end of the year. Should the test turn out to be successful, Elsevier and the Royal Library may decide to continue and expand the service.

An interesting experiment. Obviously, Elsevier is confident that it can segregate this market from the profitable university library market and, more importantly in this case, the corporate library market, at least in the Netherlands, for now. It is strange, though, that the Elsevier website doesn’t mention this news anywhere, at least not yet.

Unfortunately, the press release does not mention how the success of the test will be measured. A simple measure would be profitability, and it should not be too hard to score on that one. The cost of setting up the search engine for the Elsevier journals, which are already hosted on the Royal Library’s servers, should be marginal, so with a couple of thousand articles sold the service would already be a success. Nevertheless, marketing such a new service takes time, and the 7 months allotted to this experiment seem short.

But there might also be a more strategic goal behind the experiment. It looks to me as if Elsevier is creating an alternative road to cheap access for specific user groups, much like its program for developing countries. This can then be used as an argument in the ongoing open access debate, along the lines of: look, we poor publishers are only saying that publishing costs a lot of money, and see, we are doing our very best to create affordable access options for poorer customer groups.

Tuesday, May 1, 2012

New Ithaka report: Barriers to the adoption of online learning

Just today, Ithaka released a new report under the title “Barriers to the adoption of online learning in U.S. higher education”.

From the blurb on the Ithaka site:

“This Ithaka S+R report is a landscape review of important developments in online learning today.  It is the first in a series that will provide leaders in higher education with lessons learned from existing online learning efforts to help accelerate productive use of these systems in the future.  The goal of this research was to understand what benefits colleges and universities expect from online learning technologies, what barriers they face in implementing them, and how these technologies might be best shaped to serve different types of institutions.”
I just finished reading the report and must say that I am totally unimpressed. And surprised: the quality of Ithaka reports is usually quite high.

There are three main issues I have with the report.

There is nothing much new to be learned from their analysis of the current state of e-learning in higher education. The obstacles to the implementation of online learning that are discussed in the report have been known for a long time now, and, by consequence, the strategies mentioned for overcoming these barriers should be well known by HE administrators.

Part of the problem stems from the fact that the report is about e-learning systems that are not yet here. The authors coin the phrase Interactive Learning Online (ILO) to differentiate such a system from our current, poorly used and poorly performing, learning management systems. An ILO system, or platform, differs in that it would rely largely on machine-guided learning: data-driven, adaptive, customized to individual students, and also assisting instructors in delivering targeted guidance. One of the report’s recommendations is a national, system-wide initiative to develop such a platform. Instead of a new initiative, I think it would be much wiser to support the open source communities around Moodle and Sakai in order to capitalize on their vast experience with systems that have actually been deployed for some time now.

Also, the authors call for educational content that can easily be customized and adapted by faculty. We used to call that the not-invented-here syndrome, ten years ago, and of course the argument is still valid and the obstacle is a very real one. As with the software development issue, I would at least have expected the authors to mention Open Educational Resources as a possible solution to this problem.

Wednesday, April 11, 2012

Big data in education





The US Department of Education just released a draft version of a brief titled Enhancing Teaching and Learning through Educational Data Mining and Learning Analytics.

One of the old promises of e-learning has been to use the data generated in online learning systems to guide student learning, as well as to help instructors, designers, and managers continually improve the online learning system. In practice, most institutions do very little with the data they have, simply because they lack the skills to handle the data, or just don’t know what to do with it. After all, you can only make data work for you when you know which questions you’d like to see answered.

Technology to deal with big data has been developing rapidly lately, and the DoE thinks we might be at a tipping point, so it released this timely brief.

The report starts off with some familiar scenarios, think Netflix applied to education, before making an interesting, and useful distinction between educational data mining and learning analytics:

Educational data mining (EDM) develops methods and applies techniques from statistics, machine learning, and data mining to analyze data collected during teaching and learning. EDM tests learning theories and informs educational practice.
Learning analytics applies techniques from information science, sociology, psychology, statistics, machine learning, and data mining to analyze data collected during education administration and services, teaching and learning. Learning analytics creates applications that directly influence educational practice.
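As a toy illustration of the learning-analytics side of this distinction, the sketch below flags students whose LMS activity falls well below the class average; the data, the function name, and the threshold are all invented for illustration, not taken from the brief.

```python
# Toy early-warning sketch: flag students whose weekly LMS logins
# fall well below the class average. All data here is invented.

def flag_at_risk(logins_per_week, threshold=0.5):
    """Return student ids with logins below `threshold` times the class mean."""
    mean = sum(logins_per_week.values()) / len(logins_per_week)
    return sorted(
        student for student, logins in logins_per_week.items()
        if logins < threshold * mean
    )

activity = {"s01": 12, "s02": 9, "s03": 1, "s04": 11, "s05": 2}
print(flag_at_risk(activity))  # ['s03', 's05']
```

Real learning-analytics applications would of course work on richer signals than login counts, but even this toy shows the point made above: the analysis only exists because someone first asked a concrete question (who is disengaging?).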

The Journal of Educational Data Mining started in 2009. In 2011 both the International Educational Data Mining Society and the Society for Learning Analytics Research were founded. New societies and journals usually mark the birth of new academic fields.

There is much more good information in the 57-page report, and it will be interesting to see how the response to the brief develops.

Thursday, April 5, 2012

Recommended reading: Planning for Big Data



Big data is one of the buzz phrases this year. If you’d like a quick and well written introduction to the subject, you’re lucky. O’Reilly recently released a free e-book on the subject: Planning for Big Data. A CIO’s Handbook to the Changing Data Landscape.

In 10 chapters and under 80 pages, the book introduces the concept of big data and its importance for today’s businesses, and that would include the world of education, research and libraries.

Chapter 3 introduces the (open source) software Hadoop that is at the core of many big data applications. Chapter 4 offers a survey of the market and its main players: EMC Greenplum, IBM, Microsoft and Oracle. Chapter 5 takes a closer look at Microsoft’s strategy in the area of big data.

Chapter 6 discusses the close relation between the cloud and big data and looks at platform solutions by Amazon, Google and, again, Microsoft. Chapter 7 discusses the rapidly developing market for data. Chapter 8 discusses NoSQL, the family of (mostly open source) non-relational databases for storing and querying large amounts of unstructured, heterogeneous data.
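The MapReduce model at the heart of Hadoop can be illustrated without the framework itself. The sketch below is an in-memory teaching toy with invented sample lines, not how one would use Hadoop in practice: map each line to (word, 1) pairs, group by key, then reduce by summing.

```python
from collections import defaultdict

# Toy MapReduce word count, mimicking Hadoop's three phases in memory.

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every line."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group all values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data needs big tools", "hadoop handles big data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 3
print(counts["data"])  # 2
```

The whole point of Hadoop is that the map and reduce functions stay this simple while the framework distributes them over many machines and very large inputs.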

Chapter 9 discusses visualization as a way of extracting meaning from data, but only superficially. Chapter 10 closes the book with an outlook on the near future.

If you’d like to learn more about big data and have 90 minutes to spare, I can recommend this book.

Wednesday, April 4, 2012

The state of Open Access

One has to admire people like Richard Poynder. Since 2001 he has published a series of Open Access interviews. And the series is, of course, Open Access (OA), under a Creative Commons BY-NC-ND license.

The latest, February 2012, installment is an interview with Michael Eisen, one of the founders of Public Library of Science (PLoS).

The interviews tend towards tl;dr; the one with Eisen is a 19-page pdf file that begins with a 6-page introduction and continues with an extensive 13-page interview. However, and I have read most of the interviews in the series, the reader is always rewarded with a fresh view on the ongoing OA debate. My interest in the subject dates back to 1994, when I gave a talk about electronic journals at that year’s INET conference in Prague. This resulted, the next year, in an article in the Journal of Information Networking, itself also probably tl;dr these days (13 pages if you print the html file).

As in many interviews in the series, the central questions are about the apparent contradiction between green OA (authors self-archiving their research papers) and gold OA (OA journals); about business models - yes, there ain’t no such thing as a free lunch, although Eisen thinks the costs (for PLoS, that is) could approximate zero if faculty do all the work; and about how the future of OA will develop - what with the recent commotion about the Research Works Act (RWA) in the USA, which led to harsh but justified criticism of Elsevier.

My main takeaway from the interview is that Elsevier’s support for the RWA is a (not so) covert attack on green OA. After all these years, they are obviously really worried about Stevan Harnad’s 1994 subversive proposal. If you were looking for a strong argument for green OA, here you have one. Whether most of the scientists who signed a pledge will indeed refrain from submitting or reviewing articles for Elsevier journals remains to be seen. The Poynder interview references a similar venture in the past that didn’t make any difference at all.

What worries me more, though, is that I don’t see much real progress in the OA field. And it has to do with the deeply conservative attitudes of both faculty and libraries. Earlier this month I came across a dissertation titled "The Influence of the National Institutes of Health Public-Access Policy on the Publishing Habits of Principal Investigators". The abstract said: there’s no influence. I didn’t, and couldn’t, bother to read on.

My point is: why do we need journals anymore? And yes, I know all the answers to why we still need them, but somehow I find those answers less and less convincing as the years in this debate pass. Physics has had arXiv since the early nineties, so why would funding bodies and libraries want to take over particle physics journals (SCOAP3)? Surely, in 2012, it should be possible to overcome our outdated tenure procedures based on bad metrics like impact factors. That’s the conservatism on the faculty side. But there’s also conservatism on the library side, where collection size and collection budget are still regarded as the main ranking parameters.

Faculty and libraries are deeply committed, in ways they don’t often realize, to keep the current system alive. It reminds me of the famous quote by Daniel C. Dennett: “A scholar is just a library's way of making another library.”

There’s much more interesting stuff covered in the latest Poynder interview with Eisen. Go and read it; I’m worried about a tl;dr blog post ;-).

Update:

A couple of days after I originally wrote this post in February, Elsevier withdrew its support for the RWA, see the press release from February 27: http://www.elsevier.com/wps/find/intro.cws_home/newmessagerwa

However, they also note the following: “[W]hile withdrawing support for the Research Works Act, we will continue to join with those many other nonprofit and commercial publishers and scholarly societies that oppose repeated efforts to extend mandates through legislation.” That'd be the FRPAA.

To be continued …

Can a private investment fund make a difference in higher education?

Just today, via TechCrunch, I came across the announcement of a 100 million dollar University Ventures fund. German media giant Bertelsmann is the lead investor, together with the University of Texas Investment Management Company (they manage UT’s endowment).

This is how they see their mission and strategy:
UV is an investment fund with over $100M in committed capital focused exclusively on the global higher education sector. UV pursues a differentiated strategy of innovation from within – partnering with (rather than competing against) traditional institutions.
By partnering with top-tier universities and colleges, and then strategically directing private capital to develop programs of exceptional quality that address major economic and social needs, UV expects to set new standards for student outcomes. Specifically, UV is committed to establishing data-driven programs that ensure superior student outcomes, as well as to leveraging technology to lower cost while improving access. All UV programs are student-centric, focused on student retention and completion.
I encourage you to read their full announcement; it’s quite interesting. But it raises some questions. UV estimates the global higher education market at over 1 trillion (that’s a 1 with 12 zeros, a million times a million) dollars, so UV’s 100 million fund is 0.01 percent of that market. They seem to expect quite some leverage.
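The back-of-the-envelope arithmetic behind that 0.01 percent is easy to check:

```python
# Fund size as a share of the estimated global HE market.
fund = 100e6    # 100 million dollars
market = 1e12   # 1 trillion dollars
share = fund / market
print(f"{share:.4%}")  # 0.0100%
```

So for every 10,000 dollars in the market, UV has one dollar to invest; hence the need for leverage.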

My biggest issue is with their strategy of working together with existing institutions. If there’s one thing I’ve learned about universities, it is that they are about the worst learning organizations I know (probably the only institution worse is the Catholic church). All investments in e-learning over the past 20 years or so haven’t brought substantial change to existing HE institutions. Technology has been added to otherwise unchanged curricula and educational practices, with no substantial gains in outcomes whatsoever.

In fact, I can think of only one example of a mildly successful innovation in HE educational practices in the Netherlands, and that example was established well before technology started to not disrupt HE. In 1976, Maastricht University opened its doors. This university designed its curricula on the philosophy of problem-based learning, and only hired staff who would adhere to its principles.

My key takeaway from this is that if you want something really new in HE, you must at least start building a completely new program and design it from the ground up. I wonder if this can succeed when your strategy is to work together with existing institutions.

Nevertheless, it will be interesting to see how this develops.

E-learning, who’s in control?

Interesting story in yesterday’s New York Times about teachers (and also parents and students) in Idaho resisting the introduction of computers and online courses in high schools. In a sense it rehashes the debate about e-learning that has been ongoing since the end of the 1990s.

One quote stands out for me: “Teachers are resisting, saying that they prefer to employ technology as it suits their own teaching methods and styles”. This perfectly illustrates Larry Cuban’s famous observation that "when teachers adopt technological innovations, these changes typically maintain rather than alter existing classroom practices" (Cuban 2001, p. 71). Time and again we see that, without redesigning courses, the introduction of e-learning does not make much sense; it just adds costs, in terms of hardware, software licenses, and teacher time, without producing better results. Why is it so hard for teachers to change their dominant practice of lecturing?

Also yesterday, I came across this story about physics teachers who gave up on traditional lecturing because they found that students were not grasping fundamental concepts. Instead, they ask students to go over the material before meeting in class and to post questions in a learning management system, which the teacher uses to prepare for class. In the classroom, clickers are used to probe students’ understanding, and students discuss with each other and, more importantly, learn from each other. The simple observation, in a related article, is that a student who has just learned something might be better than an expert at explaining a new concept to a fellow student. As one of the physics teachers, Mazur, observes: “That’s the irony of becoming an expert in your field. It becomes not easier to teach, it becomes harder to teach, because you’re unaware of the conceptual difficulties of a beginning learner.”

It’s not only important for teachers to understand their own limitations as experts; the hard part, I think, is also giving up control. I had an interesting experience in my own teaching career in the 1980s. One day I came to class unprepared and felt both bad and nervous about it. So I started asking students questions and letting them discuss possible solutions among themselves. It went wonderfully; for the first time I had the feeling that students were really engaged and actively learning. Giving up control turned out to be fun as well.

Larry Cuban (2001), Oversold and Underused: Computers in the Classroom, Harvard University Press