Panlibus Blog

Archive for the 'Service Oriented Architecture' Category

Opening the Walls of the Library – SOA & web services

It doesn’t happen often, but it is really nice when something you receive that was produced for one purpose turns out to have been produced so well that it is good for so much more.  Let me explain….

My colleague Andy Latham has been pulling together a white paper, Opening the Walls of the Library – SOA and web services at Talis [pdf].  Its main purpose is to support the marketing effort behind Talis Keystone, our SOA platform that underpins Talis Library Integration Services.  To help explain those services to the not necessarily technical people in library and other departments considering integration, he needed to explore the history, principles, and practical considerations of this approach.  It is in this explanation, I believe, that he has produced a document that is a great introduction to the application of SOA and library web services in general.

Because of its original purpose, and the fact that for obvious reasons the examples and case studies come from Talis products and customers, the document could be considered by some as being a bit marketingy.  Nevertheless, if you want an overview of the real-world issues in library SOA (many of which are to do with people, not technology) – business models, web service functions, or why to choose REST in favour of SOAP – I can recommend this white paper as an informative, easy way in.

As Andy says in the conclusion:

SOA is not all about technology; SOA is a business journey that needs to follow a path with small commercial and technical steps towards a known vision of business maturity. Commercial and Open Source technology has paved a way for businesses to begin introducing an SOA strategy. Introducing an SOA strategy is as much of a technical challenge as it is an operational challenge as the technology will break down silos between teams, departments and organisations and conflicting business processes which worked well in the silo will need to be redeveloped to meet the new needs of the more agile business.

The release of the OLE’s report, which I commented upon previously, plus vendor initiatives such as OCLC’s Web Services and Ex Libris’ URM, have served to raise the prominence of web services in the world of libraries.  On a recent Library 2.0 Gang show about the OLE project it was clear, in the discussions between Andy, OLE’s Tim McGeary, Marshall Breeding and Ex Libris’ Oren Beit-Arie, that there is much more to integration than just technology.

I think it is fair to say that Libraries as a sector have not been at the leading edge of the SOA/web services debate.  It is also fair to say that, for whatever reason, the UK seems to have been a few years ahead of some areas in reaping the benefits of such integration in libraries.  As Andy’s document shows, there is the potential for significant financial and organisational benefits when undertaking integration in this way.

“The 25,000 students at one of the largest Universities in the UK are now able to pay their library charges online using either debit or credit cards, enabling further efficiency savings for library staff and improving student services.”

“Getting relevant information from Voyager into personalised portal sites has been a key requirement for the University for some time…..  By building a SharePoint integration we are maximising the positive impact of our new VLE and enhancing elements of the Library service.”

“The University of Salford is in the process of transforming the way that the identities of its entire user population are managed across all key systems in the organisation. An essential part of the solution employed (using Sun Microsystems’ IdM suite) is the transition and management of up to 23,000 Talis LMS borrower identities via Talis Keystone.”

To reap these sorts of benefits in a sustainable way a library has to be aware of, and have, a SOA strategy.  There is much in this white paper that can help those new to the subject to understand the issues.  As someone who thinks he knows about these things, I also found it very useful for checking and clarifying my assumptions.

So as I say, a recommended read….

OLE – $5.2m to get from Diagrams to an ILS Replacement in two Years

I’m currently reading my way through the final draft of the OLE Project Final Report.  The one-year Mellon-funded Open Library Environment (OLE) project “convened a multi-national group of libraries to analyze library business processes and to define a next-generation library technology platform”.  As a result:

the project planners produced an OLE design framework that embeds libraries directly in the key processes of scholarship generation, knowledge management, teaching and learning by utilizing existing enterprise systems where appropriate and by delivering new services built on connections between the library’s business systems and other technology systems.

We at Talis, along with some 200 other organisations, participated in the process by feeding back our experiences in implementing live integrations between Library Management Systems and other institutional entities that the report authors recognise as being key to delivering a seamless workflow.  Our experience indicated that successful integration between systems is as much to do with local departmental motivations, understanding, and politics as it is to do with technology.  This was discussed in more depth on the March Library 2.0 Gang show, when Tim McGeary from the OLE project and Talis’ Andy Latham were guests.

The body of the report consists of many process model diagrams, describing the required interactions between library and other processes/components, which when brought together will enable the construction of library associated workflows for the next-generation library service that will utilise this next-generation library technology platform.

This first-year project is, in its own terms, a success: “The OLE Project met all of its objectives and was completed on time and within budget”.  One cannot deny the thought, effort, commitment and enthusiasm that has gone into the production of this report.  Without rerunning the analysis they undertook, it would be difficult to criticise the model they have described.  The proof of the pudding, of course, will come in the next phase, when they move on from describing a new technology platform to start building it.

The planning phase of this project is complete. The next steps are to identify a group of build partners to provide investment funds and to develop and test the initial software. A build partner  can be an individual library, a consortium or a vendor.

The total partnership cost of the OLE Project over two years is projected to be $5.2 million, a figure that includes all programming effort as well as project management and quality assurance staffing. In addition to OLE Project costs, costs of participation would include some local staff, governance and travel funding. Project partners intend to contribute half of the OLE partnership costs and seek the other half from The Andrew W. Mellon Foundation.

Viewing the process diagrams in the report takes me back to 1990, in a snow-covered hut in the grounds of the University of Birmingham.  I shared that hut for several weeks with Talis (then BLCMP) staff and a group of folks from a Dutch library system vendor (long since subsumed into the OCLC global organisation) with the objective of designing the next-generation library technology platform.  Several years, and a few £ million in investment, led to the development of a very successful library system from which the current Talis Library System, Alto, has since evolved.

There are many parallels between that 1990 development process and the road that OLE are about to embark upon, if their bids for continued funding are successful.  Not only was BLCMP a library cooperative during that period, but we also had the luxury of being able to step back from previous systems and start with a clean set of library process requirements.

I wish the OLE project continued success.  Whatever is achieved, I believe the exercise they are undertaking is massively valuable to the whole library domain.

Will they be able to translate their clean diagrams [uncluttered by interaction issues with systems over which they have little influence, uncoloured by local institutional inter-departmental politics and ‘traditional practices’] into an installable, manageable collection of components suitable to deliver format-agnostic library services? – possibly.  Will they be able to do it in two years for a mere $5.2m? – Experience tells me to be a little more sceptical on that last point.

Integrated library management systems: what we need

As part of the “Shock of the New” strand at the UK Umbrella conference this year, Lucy Tedd from Aberystwyth University led a session entitled “Integrated library management systems: what we need”. Attendance at this session turned out to be very supplier-heavy, and I’m not sure that’s what she anticipated. I was moderately surprised too, but thinking about it afterwards, I felt that the lack of interest from practitioners was reflective of the growing irrelevance of the traditional library management system (or ILS if you’re North American) to the needs of the modern library, particularly in academia.

It’s not that the library technology landscape has stood still, of course. Lucy was able to list quite a few innovative products – from the now-established Aquabrowser to Talis’ own Aspire resource list tool – a great product that we’re all very proud of here. But taking one step back and looking at what the library has to deliver in 2009, the library technology marketplace as a whole is failing to keep up with the pace of change.

Lucy Tedd highlighted some of the key developments of this decade. Some of them, though – such as the consolidation of the library technology marketplace with mergers, acquisitions and the increasing intervention of venture capitalists in the businesses of existing suppliers – may be symptomatic of underlying trends rather than drivers.

I felt that to get a firmer grip on the fundamental shifts in our world, I had to refer back to a session I saw last month at the annual SCONUL conference, given by Marshall Breeding (a member of Talis’ Library 2.0 Gang). For the uninitiated, Marshall Breeding is an American library technology guru, author of an ongoing series of library technology guides. Where he wins out over other commentators such as Lucy Tedd is his ability to look behind headline trends, take them apart, examine the implications and project them forward. So although both Tedd and Breeding identify industry consolidation as a key trend, Breeding will go on to alert us to the disruptive impact that this has on product development, and the adverse effect this has on the lead time that libraries have to plan for a product enhancement.

Marshall Breeding hears a lot of frustration with LMS products and vendors, and is adamant that systems are not keeping up with the pace of change in libraries. Innovation, then, is falling below expectations, and Marshall reports that many US libraries are unhappy with the current state of affairs. He admitted that he wasn’t so sure about UK libraries, but following the group activity at the end of Lucy Tedd’s session, I’m quite clear that the mood here is similar to that of the US. In my group there was one librarian from the Open University and one from the University of Hertfordshire. Each group was asked to identify its most pressing requirement of the LMS. Both librarians agreed that the inadequacy of the LMS in managing e-resources was the biggest problem in an era in which the issuing of books is no longer the primary activity.

Marshall Breeding described the conventional LMS as untenable, now that a whole series of products required to manage fundamental library processes – such as ERM systems and knowledgebases – are located outside the LMS. In the electronic era, circulation becomes fulfilment, cataloguing is no longer MARC-centred, for example. So as the traditional modules of the LMS become less important, we need to think more in terms of SOA (Service-Oriented Architecture) – dividing functionality into small chunks that can be fitted together for multifarious purposes (a shift that my colleague Richard Wallis identified back in 2007 on this blog). This is very much the thinking of the OLE (Open Library Environment) Project, of which Marshall Breeding is a proponent.
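As a loose illustration of that “small chunks” idea, here is a minimal sketch – all service names, fields and wiring are invented for illustration, not drawn from any real OLE or vendor API – of how a discovery front end might compose independent services rather than call into one monolithic LMS:

```python
# A hypothetical sketch of SOA-style decomposition: each function stands in
# for an independently replaceable web service, and the front end composes
# them. Nothing here is a real product API.

def holdings_service(isbn):
    # Stand-in for a holdings/availability service.
    return {"isbn": isbn, "copies": 3, "available": 2}

def fulfilment_service(isbn, user):
    # Stand-in for a fulfilment (circulation/delivery) service, only
    # invoked when nothing is available on the shelf.
    return {"isbn": isbn, "user": user, "action": "reserve", "ok": True}

def discovery_page(isbn, user):
    # The front end stitches the chunks together; either service could be
    # swapped for another supplier's implementation behind the same call.
    holdings = holdings_service(isbn)
    options = fulfilment_service(isbn, user) if holdings["available"] == 0 else None
    return {"holdings": holdings, "fulfilment": options}

print(discovery_page("9780141036144", "s1234"))
```

The point of the sketch is the seam between the functions: in a service-oriented world, each seam is a network interface that any supplier (or open source project) could implement.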

But it’s not just a back-office problem, of course. The library OPAC, traditionally another module of the LMS, also suffers from the same problem, in failing to reflect the eJournals and digital objects that libraries spend so much money on. Breeding did identify further issues with library OPACs, highlighting their clunky interfaces, poor eCommerce facilities, and more worryingly, relatively weak search engines and poor relevancy ranking.

Open Source has, in the context of these difficulties, generated a lot of interest, though more in the US at present. However Breeding pointed out that Open Source offerings currently rank middle to low in terms of customer satisfaction, and the only libraries that are interested are the ones that are already doing it. There is no groundswell of interest, despite the pockets of evangelistic fervour.

Marshall Breeding also turned his attention to Web 2.0 tools, and argued persuasively against the tendency to adopt disparate tools without a broader strategy in place, which has the effect of “jettisoning library users away from our websites”. Instead, he says, Web 2.0 capabilities need to be built into the guts of our systems. I’m assuming here that he doesn’t mean library vendors reinventing social networking tools in a creepy treehouse kind of way, and that instead he’s advocating seamless integration with applications such as the VLE and Web 2.0 tools such as Twitter. Incidentally, Richard Wallis has recently been demonstrating a Juice extension enabling integration between Twitter and the OPAC.

Breeding looks forward to a future in which the library can offer a single point of access to the inside of all the eJournals that the library subscribes to. Scale is not the issue, he argues, and cites OCLC’s Lorcan Dempsey as pointing out that the whole of WorldCat will now fit on an iPod. Instead we should be looking at what the world outside the library is doing – searching the deep content directly, and identifying and examining the tools that people are using to do this. In this way, it becomes clear that the likes of Google Scholar, Amazon, Waterstones and ask.com are the competitors of the library in the 21st century, and it is incumbent upon the vendor community to help libraries with that gargantuan challenge if they are to survive.

The Library 2.0 Gang – the vendors view on OCLC Web-scale

On the Library 2.0 Gang back in May we discussed Cloud Computing, an architecture in which you use your web browser to access your services on computers hosted by your system provider. 

Unlike traditional hosting, where you would expect to identify which system is running your application, cloud services appear as one big application servicing everyone’s needs, spread across many computers and often data centres spread around the Internet.  The conversation last month was prompted by OCLC’s announcement that they are developing such a service for delivering library services such as circulation, acquisitions, and license management.  The introduction of library services from the cloud, in a market where the vast majority of libraries host their own systems, could be potentially game changing, and we speculated on what the reaction of the current suppliers would be.

In an attempt to answer some of that speculation I brought together a gang for the June show consisting of representatives of some of those suppliers – Carl Grant from Ex Libris, Nicole Engard from LibLime who support the Open Source system Koha, and Rob Styles from Talis.  We were joined by a new guest to the show Boris Zetterlund from Scandinavian and now UK supplier Axiell.

Technical issues, potential costs, applicability for smaller libraries, and openness of data & APIs all got an airing in this interesting conversation – have a listen.

What ‘is’ Web-Scale?

It will have been difficult to miss OCLC’s recent Cloud Computing announcement.  If you have missed it, the headline is that they say they are building an architecture capable of handling all the transactions of all libraries, meaning that they can add circulation, acquisitions, license management and several other aspects of library management to their WorldCat shared discovery capability.

As you can imagine, all this is built upon racks of computers hosted at OCLC’s data centre, combining their power to deliver a service to many users at the same time.  It is a well-proven approach, as used by Google, Sun, Amazon, Salesforce.com, and even here at Talis, where the Talis Platform underpins our Engage, Aspire, and Prism products.  The rest of the computing world describes this as Software as a Service (SaaS) or Cloud Computing.

For some reason OCLC are determined to come up with their own term – Web-Scale.  OCLC’s Andrew Pace, in his recently published Talking with Talis podcast [highly recommended if you want an insight into this initiative], tries to explain why the library world needs such a term.  The inaugural post on the OCLC Engineering blog, by Mike Teets, also goes into much depth as to what Web-Scale is.

Having read and listened to all this, I must admit I’m still unconvinced.  It still sounds as if engineering has been brought in to support, with a technical description, the marketing folks’ desire to be different.  There are enough confusing terms hijacked by marketeers in the computing and Internet worlds.  So I’m sure OCLC will forgive me if I continue to describe their approach as cloud-based software as a service – Cloud Computing.

OCLC’s Andrew Pace Talks with Talis about Web-Scale ILS

To find out about OCLC’s move into providing hosted, Web-scale, Software as a Service functionality for managing libraries, who better to ask than the person responsible for the programme?

Andrew Pace, Executive Director, Networked Library Services has been working on this for the last fifteen months, and as you can hear from our conversation is pleased that he can now talk openly about it.

Our wide ranging conversation takes us from the epiphany moment when Andrew announced he wanted to be a librarian through to the strategic, and architectural decisions behind this significant OCLC initiative.  

Andrew’s answers to my questions add depth and background to the brief details so far released in his blog posts and OCLC’s press releases.

The Library 2.0 Gang on Cloud Computing Libraries and OCLC

As commented previously, OCLC recently announced a bold move into providing hosted, Web-scale, Software as a Service functionality for managing libraries.  Joining other library cloud computing initiatives, such as SerialsSolutions’ Summon product and Talis Prism, this is the first to venture into the realms of circulation and acquisitions.

In light of this and at a time when Cloud Computing is gaining acceptance in the wider Internet and computing worlds, now was a great time for the Gang to focus their thoughts and comments on the topic.

In this month’s Library 2.0 Gang, new Gang member Frances Haugen from Google joined Marshall Breeding and our guest Dr Paul Miller to explore what Cloud Computing is, and to speculate on how the OCLC announcement should be viewed.

OCLC takes aim at the library automation market from the Cloud

Over the last few years OCLC, the US-based not-for-profit cataloguing cooperative, has been acquiring many for-profit organisations from the world of library automation, such as PICA, Fretwell-Downing Informatics, and Sisis Information Systems.

About fifteen months ago, Andrew Pace joined OCLC, from North Carolina State University Libraries, and was given the title of Executive Director, Networked Library Services.  After joining OCLC Andrew, who had a reputation for promoting change in the library technology sphere, almost disappeared from the radar.  

Putting these two things together, it was clear that the folks from Dublin were up to something beyond just owning a few non-US ILS vendors.

From a recent post on Andrew’s Hectic Pace blog, and press releases from OCLC themselves, we now know what that something was.  It is actually a few separate things, but the overall  approach is to deliver the functionality, traditionally provided by the ILS vendors (Innovative, SirsiDynix, Polaris, Ex Libris, etc., etc.), as services from OCLC’s data centres.   This moves the OCLC reach beyond cataloguing in to the realms of acquisitions, license management, and even circulation.

The idea of breaking up the monolithic ILS (or LMS, as UK libraries refer to it) is not a new one – as followers of Panlibus will know. Equally, delivering functionality as Software-as-a-Service (SaaS) has been native to the Talis Platform since its inception.  It is this that underpins the already established SaaS applications Talis Prism, Talis Aspire and Talis Engage.

Both OCLC, with WorldCat Local, and Talis with Prism have been delivering public discovery interfaces (OPACs) as SaaS applications for a while now, ‡biblios.net have recently launched their social cataloguing as a service [check out the podcast with Josh Ferraro], but I think this is the first significant announcement of circulation as a service that I have been aware of.

The move to Cloud Computing, with its obvious benefits of economies of scale and the removal of the need for libraries to be machine minders and data centre operators, is a reflection of a much wider computing industry trend.  The increasing customer base of Salesforce.com, and the number of organisations letting Google take care of their email, and even their whole office operation (such as the Guardian), are testament to this trend.  So the sales pitch from OCLC, and others including ourselves here at Talis, about the total cost of ownership benefits of a Cloud Computing approach is supported and validated industry wide.

So as a long time predictor of computing transforming from a set of locally managed and hosted applications to services delivered as utilities from the cloud, mirroring the same transformation for electricity generation and supply from a century ago,  I welcome this initiative by OCLC.   That’s not to say that I don’t have reservations. I do. 

The rhetoric emanating from OCLC in these announcements is reminiscent of the language of the traditional ILS vendors who are probably very concerned by this new and different encroachment on to their market place.  There is an assumption that if you get your OPAC from WorldCat (and as a FirstSearch subscriber, with this on the surface ‘free offer’,  you are probably thinking that way), you will get circulation and cataloguing and all the rest from a single supplier – OCLC.

The question that comes to mind, as with all ILS systems, is: will you be able to mix and match different modules (or in this case services) from different suppliers, so that libraries can have the choice of what is best for them?  Will OCLC open up the protocols (or, to be technical for a moment, the hopefully RESTful APIs) to access these application/service modules so that they can be used not only with other OCLC services but with services/applications from Open Source and other commercial vendors?  Will they take note of, or even adopt, the recommendations that will come from the OLE group [discussed in last month’s Library 2.0 Gang], which should lead towards such choice?

Some have also expressed concern that a library going down the OCLC cloud services route will be exposing itself to the risk of ceding to OCLC control of how all its data is used and shared, not just the bibliographic data that has been at the centre of the recent storm about record reuse policies.  Against that background, one can but wonder what OCLC’s reaction would be to a library’s request to openly share circulation statistics from the use of its OCLC-hosted circulation service.

This announcement brings to the surface many thoughts, issues, concerns, technological benefits and questions that will no doubt rattle around the library podcasting and blogosphere for many months to come.  I also expect that in the board rooms of the well known commercial [buy our ILS and a machine to run it on] providers, there will be many searching questions asked about how to deal with the 500lb [not-for-profit] gorilla that has just moved from the corner of the room to start dining from their [for-profit] table.

This will be really interesting to watch…..

The composite image was created using pictures published on Flickr by webhamser and Crystl.

Open Library Environment Project – is SOA right?

OLE – the Open Library Environment Project – has been around for about a year now, and I am guilty of not monitoring it as closely as I would have liked to.  So the opportunity to listen to their recent webcast seemed a great way to get up to speed again.

Following the instructions on the OLE Project site to replay the webcast led me to one of the most unusual webcast playback experiences I’ve had for a while.  To see the slides you have to click through to a service run by Adobe Acrobat, which provides a good representation of the webcast environment, complete with chat traffic in real time.  The problem then is that you have to use the telephone system to get the audio.  This is not a cheap exercise for those of us having to dial internationally – at least with Skype Out you can keep the costs down a bit.  Synchronising the listening with the viewing is then a bit of a challenge, especially if you have to pause and restart.

Anyway enough about the experience – what about the content?

What is clear is that the Mellon Funded Project has got a great deal of attention and significant partners from academic and national libraries.  They also have a challenging and worthy goal, which they are taking significant early steps towards:

“By the end of our project, we will have a design for a next-generation library system using Service Oriented Architecture. We also will have built a community of interest that can be tapped to help build the OLE framework.”

The webcast inevitably, especially in the Q&A section, swung between the low-level detail, the strategic approach, and things like privacy, which are more the policy concern of potential implementing libraries than of the project itself.

Having listened to it, it is clear that they are working on the assumption that implementing libraries would have to throw away their current investment in commercial or open source systems and build all this from scratch – this being based on experience of the current generation of systems not being capable of integrating easily, or not dealing with electronic resources.  That is a heck of a large chunk to bite off, even if you pull in things like circulation and cataloguing from other projects.

Experience also leads me to strongly question the emphasis on Service Oriented Architecture (SOA) – that is, if SOA is being used as generally understood, as opposed to as a generic term for systems being connected via web-style calls.

A bit of background on that ‘experience’ I mention – there are [in general terms] two approaches to Web Services: tightly coupled SOA, and loosely coupled REST-based services.  The difference is that a SOA developer/integrator trying to embed the service into their application needs access to web service descriptions and other enterprise integration tools, whereas in the RESTful world integration calls can often be tested using a web browser, and integrators/developers need no more development tools than they currently use.
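To make that barrier-to-entry difference concrete, compare the shape of the two requests for a hypothetical borrower lookup. The endpoint, element names and borrower ID below are invented for illustration; they are not real Talis Keystone calls.

```python
# RESTful style: the whole request is a URL. You can paste it into a
# browser to test it, and any HTTP client library can make the call.
rest_request = "http://library.example.edu/services/borrowers/12345?format=xml"

# SOAP style: the equivalent lookup is an XML envelope POSTed to the
# service, and in practice you also need the WSDL service description
# plus client tooling to generate and parse these messages.
soap_request = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetBorrower xmlns="http://library.example.edu/services">
      <BorrowerId>12345</BorrowerId>
    </GetBorrower>
  </soap:Body>
</soap:Envelope>"""

print("REST:", rest_request)
print("SOAP:", soap_request)
```

Neither approach is “wrong”; the point is simply that the first can be exercised by anyone with a browser, while the second presumes an enterprise tooling stack before the first test call is even made.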

Both SOA and REST have their benefits and their, sometimes religious, proponents.  Having first used SOAP (the underlying messaging protocol for SOA) back in the late 1990s, I have been using both of these competing approaches for some time.  Talis over that time has developed, rolled out, and established a significant user community for a product known as Talis Keystone.  Keystone is a web service integration component designed to enable external enterprise services (Student Registries, Finance Systems, Student Portals, e-payment services, CRM systems, etc.) to easily and reliably integrate library system data and functionality into their workflows.

Keystone is now in use in many Talis customer libraries in the UK, and with some libraries running a system from another vendor.  Successful integrations have been completed with products such as: Agresso, Civica, Oracle, and SAP finance systems; Microsoft SharePoint, uPortal, Moodle, and Blackboard learning and portal environments; and WorldPay e-payment services.  Integrations with systems from other suppliers are already in the pipeline.

From day one, Talis Keystone has had the capability to support both SOA and RESTful integration.  It may be useful for projects such as OLE to reflect on the experience of rolling out these integrations, and the take-up of the REST and SOA options.  The vast majority of these integrations have taken the RESTful approach, with only one or two going for SOA.  There are many reasons for this, but they all fall under the heading of there being a much lower barrier to implementing REST than SOAP.  Pragmatically, I am of the opinion that lack of SOA capability would not have prevented any of these integrations taking place, whereas if SOA was the only choice many would not have been undertaken at all.

I/We would be more than happy to share some of these experiences in implementing and rolling out a product that addresses many of the concerns of the OLE Project.


Why Nodalities?

“I read the Panlibus blog – I note Talis has another house blog called Nodalities – why is this, and why/who should be reading it?”

One of the major recurring themes from myself and others in Panlibus postings is Library 2.0 and its more general cousin Web 2.0. If you followed the links I provided to their descriptions in Wikipedia you will have discovered that they are both labels for a collection of attributes as against specifications.

I have yet to read a complete, concise definition of what Web 2.0 or Library 2.0 ‘is’ [and probably never will]; nevertheless it is far simpler to look at an application or service and pronounce to the world that it is very Web 2.0, and be fairly confident that people will understand what you mean.

Web 2.0 is virtually all about technology – Web Services, Service Oriented Architecture, Social Networking tools, etc. – whereas its Library relative mixes all of that with a heavy dose of using those Web 2.0 tools and the customer handling and social skills of the library community to provide a better service to library users.  Debates about the use of mobile phones, and the provision of coffee, in a Library environment are often found in the Library 2.0 world.

We at Talis are the ‘Technology Guys’ in the Library equation, and although interested in all that is debated, our motivations are all about how new and emerging technologies [currently labelled Web 2.0] can be beneficially applied in the Library world. To this end you will find me and my colleagues evangelising on the subject both here and at conferences around the world such as these: Access2006, Internet Librarian International, Stellenbosch Symposium, Internet Librarian 2006, and the Charleston Conference.

The Talis Platform is an excellent example of applying Web 2.0, Semantic Web [to mention another ‘label’], SOA, and other technologies to provide innovative solutions to the liberating of library data, functionality, and services for the benefit of all.

In the process of proposing and delivering those [currently library specific] solutions, we are pushing both the theoretical and practical boundaries of web technologies and the theories and standards behind them – especially in the World Wide Web Consortium, where you will find Talis involved with several committees. In doing this we are very active members, with much to contribute and say, of the world community driving forward these technologies.

This is where Nodalities comes in. You will note [today] that there is a posting from me picking up points from the blogs of Ian Davis and Sam Tunnicliffe, from our Platform Team, who are currently at the Web 2.0 Summit in San Francisco. If you are interested, like I am, in the way that all things Web are [and are being predicted to be] moving, you will find what they are reporting most engrossing.

Reading between the lines of what is being presented it is clear that the advances already being demonstrated by the Talis Platform are only the first step in a massive change in the way large sets of data and metadata (often only linked by semantics), can be marshalled, related together, and combined to change the way information is used in the future.

Depending on the context, you will find Talis people attending and/or speaking at both Library and more general conferences across the world. Our knowledge and understanding of the issues surrounding the library and information industries is very valuable input into the wider technology world. As we have demonstrated, this is a two-way street. It is absolutely certain that our knowledge and understanding of the Web 2.0 world is already adding unique value to the world of libraries.

So to answer the question at the start of this posting…..

If you are in the library community and want to keep abreast of technology advancements – read Panlibus. If you are in the wider web community and are interested in what we are doing, and have to say about, applying these technologies as a Platform in real-world situations – read Nodalities. I suspect most people, although concentrating on one, will find postings of interest in both Panlibus and Nodalities.
