Panlibus Blog

Thoughts on trusted source

Last month, I picked up Rupert Murdoch’s speech on the impact of the blogosphere on news media from Dan Gillmor’s blog. When I went and read Murdoch’s speech, given to the American Society of Newspaper Editors, in which he challenged the audience, a “bunch of digital immigrants”, to make themselves relevant to a new world of “digital natives”, it felt very apposite to our own domain.

You can read Rupert Murdoch’s full speech at:
and it’s well worth a read.

I hadn’t given a lot of thought to the similarities between the newspaper industry and the library domain before, but the question he posed to a room full of newspaper editors is exactly the same as the question we (the library domain) are asking ourselves.

We live in an age where younger, internet-savvy consumers, the “digital natives”, feel that they have sufficient information skills to connect with all the information they need. They don’t believe that the information they can get from a library or from printed media has any more authority or air of truth about it than what they can find for themselves by searching on Google.

Speaking as a user who relies heavily on Google, A9 and other search engines to find information on a huge wealth of subjects, I’m not going to denigrate them. My world would be all the poorer if I had to go back in time and rely again on my own willpower to happen upon the right places to find the correct information. But this should not make internet users complacent.

I recognise that once upon a time, when trusted, reliable sources of information took more than the click of a button to arrive at, I used to think very seriously about where I was going to find information. I spent a long time thinking about who offered authority on a subject before going to that authority for guidance. Many of us now skip that process because of the sheer volume of data made accessible to us. We rely on algorithms to conduct that filtering process for us, but is that wise?

The impact of this was brought home to me when I listened to the Today Programme on BBC Radio 4 on Wednesday 11th May. The subject was Hepatitis B, an infection that kills more than a million people every year. In the UK, despite having a vaccine approved by the British Medical Association and available to us free of charge, we have 180,000 people who are chronically infected. We also have information freely available on the internet claiming a causal link between the vaccine and multiple sclerosis, which has raised public fears around the treatment of Hepatitis B.

Obviously it’s impossible to prove there is a correlation between the advancement of this disease in societies like the UK and the growing use of the internet. But as Tom Feilden, the BBC Science Editor, pointed out, there needs to be a greater understanding among members of the public who use the internet as their trusted source for medical information that not “all information is good information” and that we have “got to get cleverer in how we use it”.

There are lots of lessons here, I feel. Suggestions have been made that we lack authoritative bodies to conduct and publish the results of investigations into issues of public medical concern; the MMR vaccine has been cited as another example where the groundswell of opinion clearly obfuscated the medical recommendations of authorities like the Chief Medical Officer.

But leaving aside the publishers of trusted information for a moment, surely we are also saying that the information profession continues to have a role in acting as a conduit, directing users to trusted digital sources? That piece of the jigsaw was unfortunately missed out of the BBC report. Within various Web 2.0 discussions we hear about “reputation systems”, like the eBay “seller rating” model, evolving. I would like to see reputation systems built on knowledge acquired and understood by the information profession, along the lines of the sketch below.
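To make that idea a little more concrete, here is a minimal sketch in Python of how such a system might weight a source’s reputation. Everything in it is hypothetical: the class names, the 1–5 rating scale, and the weighting that makes a professional’s rating count for more than a casual community vote are all illustrative assumptions, not a description of any existing system.

```python
from dataclasses import dataclass, field

@dataclass
class Rating:
    score: int              # 1 (unreliable) to 5 (authoritative); hypothetical scale
    is_professional: bool   # True if rated by an information professional

@dataclass
class Source:
    url: str
    ratings: list = field(default_factory=list)

    def reputation(self, professional_weight: float = 3.0) -> float:
        """Weighted average of ratings, where a professional's rating
        counts more than a community vote. The weight of 3.0 is an
        arbitrary tuning parameter chosen for illustration."""
        if not self.ratings:
            return 0.0
        total = weight_sum = 0.0
        for r in self.ratings:
            w = professional_weight if r.is_professional else 1.0
            total += w * r.score
            weight_sum += w
        return total / weight_sum

# Usage: a librarian's curated review outweighs an anonymous vote.
src = Source("https://example.org/hep-b-vaccine-guidance")
src.ratings.append(Rating(score=5, is_professional=True))   # professional review
src.ratings.append(Rating(score=2, is_professional=False))  # anonymous community vote
print(round(src.reputation(), 2))  # 4.25
```

The point of the weighting is exactly the argument above: rather than letting raw volume of opinion drive a score, as a pure eBay-style model would, the filtering judgement of the information profession is given explicit extra weight.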
