Tuesday 17 December 2019

Ezproxy harvest now automated

Just in time for the Christmas break, I have got the harvesting of our ezproxy logs fully automated.

From now on, a shell script will run in the early hours of each morning on our ezproxy server.  This gathers the latest day's worth of log data, reformats it, and then loads it all into our MySQL database.
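
The production job is a shell script, but purely as an illustration, here is roughly the same nightly step sketched in Python. It assumes the log naming (ezproxy.log.DDMonYYYY) and the field positions described in the walkthrough posts further down this page; the paths are invented.

# Sketch only: pick up yesterday's EZproxy log, keep the authentication
# lines, and write out the four fields the loader needs.  Paths are invented;
# the naming and the 'connect?session' marker follow the walkthrough posts below.
from datetime import date, timedelta

yesterday = date.today() - timedelta(days=1)
log_file = "/var/log/ezproxy/ezproxy.log." + yesterday.strftime("%d%b%Y")  # e.g. ezproxy.log.16Dec2019

with open(log_file) as logs, open("ezproxy.out", "w") as out:
    for line in logs:
        if "connect?session" not in line:
            continue  # only keep lines where a user authenticates to a resource
        fields = line.split(" ")
        # fields 1, 3, 4 and 7 of the log: IP address, user, timestamp, resource
        out.write(" ".join(fields[i] for i in (0, 2, 3, 6)) + "\n")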

It means I can leave the process running without manual intervention (which had taken the form of weekly FTPs of log files and manual script-running), and start concentrating on working with colleagues to:

  • come up with a useful dashboard of commonly-needed information
  • devise some more challenging questions to pose about usage patterns with our electronic resources
I will save that fun until 2020.

It should also mean that I can start stepping back from the process, and leave colleagues with a useful new facility for looking at patterns of usage data that, until now, have never been available to us in any usable form.


Thursday 12 December 2019

The 'Quickcount' - an easy tally of enquiries, with built-in reporting and graphs



A couple of years ago, we chose to let our contract lapse with the third party who provided our enquiries logging system.

This outside system cost a lot of money, was poorly regarded by staff, and aimed to satisfy such a straightforward need that I was sure I could write something that did the same job - but at no cost.

Also, the available reporting was very limited. There were no graphs, and staff could not ask different questions of the data without contacting the supplier to code up a new report.

Basically, the system needed to provide a simple counter that would allow staff on our desk to record the types of enquiry that came their way (e.g. 'Directional', 'Printer problems', 'Using the catalogue').

The answer I came up with used PHP and a MySQL database.

Using this in-house model, staff recorded queries with a rapid tap of a button on a web page.  This logged the details to a MySQL database.

From the database, we could report back on the enquiries received using a data visualization package of choice.

To begin with, we exported the data as CSV and looked at it in Excel, but very quickly we saw the potential for creating real-time dashboards using FusionCharts, Google Charts, and latterly Microsoft Power BI.
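
The live version is written in PHP, but the whole idea fits in a few lines. Here is a rough Python sketch of the same pattern, using the mysql-connector package; the table and column names are invented for the illustration.

# Sketch of the Quickcount idea in Python; the live version is PHP.
# Table and column names are invented for the example.
import mysql.connector
from datetime import datetime

conn = mysql.connector.connect(host="localhost", user="quickcount",
                               password="***", database="library_stats")

def log_enquiry(category):
    """Called when a member of staff taps a category button."""
    cur = conn.cursor()
    cur.execute("INSERT INTO quickcount (category, logged_at) VALUES (%s, %s)",
                (category, datetime.now()))
    conn.commit()

def tally_by_category():
    """The whole of the 'reporting': a count per category, ready for charting."""
    cur = conn.cursor()
    cur.execute("SELECT category, COUNT(*) FROM quickcount GROUP BY category")
    return cur.fetchall()

log_enquiry("Directional")
print(tally_by_category())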

The end product took a few enjoyable hours to develop, and provided us with a system that was quick and easy to use, and which provided as many statistics as we fancied.

You can try out a straightforward demonstration here:

https://www.sussex.ac.uk/library/quickcount/quickcount_demo.php

Feel free to tap away at the 'Category' buttons, then click to see either the real-time reports that use FusionCharts, or the delayed (but much more attractive) ones via Microsoft Power BI.

I would be happy to share the code for this with anyone who might have a use for a no-frills counter that provides real-time visualisations.

Monday 18 November 2019

The Real World comes a-calling: five weeks of coding fun!

As a techie, I have to admit that when we put our new kiosks into place on the first and second floors, they looked rather excellent.

However, as often happens to techies, it didn't take long for the real world to point out what should have been glaringly obvious to me all summer.

As soon as people started to look at these fine kiosks, staff began to comment that everyone had to incline their head to one side to work out where they were.

I had even been doing the same myself, without realising it.

The explanation was simple.  The floorplan graphics had all been drawn with our main entrance at the bottom of the picture.  This meant that, when viewed at a kiosk location on an upper floor, the map was orientated at 90 degrees to the standpoint of the user.

So, a subject that was shelved straight in front of someone appeared as if it should be off to their right.

My subconscious mind must have been suppressing this blooper for at least a year: it had been the first thing my daughter mentioned when I showed her the floorplans (don't you wish you had an interesting Dad like me?).  This time, I couldn't take refuge behind parental knowing-it-all.

I was going to have to rotate the floorplans.

Sounds easy, doesn't it?  That is certainly what everyone thought who suggested it to me.  'Why don't you just ...'

However, to pull it off, I had to go back to the beginning with the first and second floor image files, and redraw new versions of the graphics.  This was because of my earlier innocence when converting the floorplans into vector graphics, which had left me with no way to rotate a floor without all the text turning onto its side.

That set me off on five weeks of recoding two floors to reflect the real world.

I finished the job on Friday afternoon (it is now Monday morning), and can still hardly believe I am free of it.  The end results are what I should have aimed for from the beginning.  Now that I have mapped all the text and building landmarks as proper mathematical co-ordinates, I have plans that I can edit quickly and easily.

The end results are worth every second I spent on them.  I just wish I had known eighteen months ago how I could have avoided needing to do them at all.







Friday 15 November 2019

Ezproxy - walkthrough final step - cool graphs!

Step Four: analyse the data and make whizzy graphs

All that remains is to point your reporting tool of choice at the MySQL database.

We have been using Microsoft Power BI to quickly produce visualisations that depict usage of our electronic resources in a way that has never been possible before.



Ezproxy walkthrough Part Two

Step Three - Python script

I have put a slightly redacted version of the Python script at:
https://github.com/alfi1/ezproxy-harvest
The script goes through the output file I created previously from our log files, and for each line:

  • Makes an Alma API call that gets information about the user from our Library Management System
  • Extracts 'school', 'department', and 'course' from the API results
  • Converts the IP address of the requester to a geographic location (country name)
  • Writes out to a MySQL database:
      • school
      • timestamp
      • resource
      • country
      • usergroup
      • course
From a GDPR point of view, note that the user name is discarded and not sent to the database.  We only want a school/department/course affiliation, and nothing that could identify the individual.
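
For a flavour of what that looks like in practice, here is a heavily condensed sketch of those four steps - it is not the production script itself.  The Alma Users API URL pattern is the standard Ex Libris one, but the API key, statistic-category names, GeoIP database path and table name below are all stand-ins for our local configuration.

# Condensed sketch of the per-line processing; the full (redacted) script
# is at https://github.com/alfi1/ezproxy-harvest
import requests
import geoip2.database
import geoip2.errors
import mysql.connector

ALMA_API = "https://api-eu.hosted.exlibrisgroup.com/almaws/v1/users/{user}"
API_KEY = "our-alma-api-key"   # placeholder

geo = geoip2.database.Reader("GeoLite2-Country.mmdb")   # local GeoIP database
db = mysql.connector.connect(host="localhost", user="ezproxy",
                             password="***", database="eresources")

def lookup_user(username):
    """Call the Alma Users API and pull out group/school/course.
    Where school and course live in the user record is site-specific;
    ours come from user statistic categories."""
    r = requests.get(ALMA_API.format(user=username),
                     params={"apikey": API_KEY, "format": "json"})
    record = r.json()
    usergroup = record.get("user_group", {}).get("desc", "")
    stats = {s["category_type"]["value"]: s["statistic_category"]["desc"]
             for s in record.get("user_statistic", [])}
    return usergroup, stats.get("SCHOOL", ""), stats.get("COURSE", "")

def ip_to_country(ip):
    """Convert the requesting IP address to a country name."""
    try:
        return geo.country(ip).country.name
    except geoip2.errors.AddressNotFoundError:
        return "Unknown"

cur = db.cursor()
with open("ezproxy.out") as extract:
    for line in extract:
        ip, username, timestamp, resource = line.split(" ", 3)
        usergroup, school, course = lookup_user(username)
        # The user name is discarded at this point: only the affiliation
        # reaches the database.
        cur.execute("INSERT INTO ezproxy_usage "
                    "(school, timestamp, resource, country, usergroup, course) "
                    "VALUES (%s, %s, %s, %s, %s, %s)",
                    (school, timestamp, resource.strip(), ip_to_country(ip),
                     usergroup, course))
db.commit()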

Ezproxy harvesting - walkthrough of the steps so far

I am genuinely happy at the prospect of being able to analyse usage statistics for our electronic resources.  I have spent so many years telling colleagues it was impossible that I feel ashamed I never seriously tried to pull it off before.

My Python -> MySQL model is shaping up well.

Here is an outline of how the process works so far.  (This will all be automated at a later stage, but at the moment involves me taking the place of scheduled jobs).

Step One: get the Ezproxy logs

We host our own Ezproxy server, so I just FTP the most recent batch to a network drive that allows me to run Python.

The log files I need are named along the lines of:
  • ezproxy.log.04Nov2019
  • ezproxy.log.05Nov2019
  • ezproxy.log.06Nov2019

Step Two: extract the details I need

From these huge logfiles, I only need a tiny subset of information:
  • IP address of the requester
  • User name of the requester
  • Timestamp
  • Which of our electronic resources they viewed
I do this at the command line, by going through the logs and cutting out what I need:

cat ezproxy*.log* | cut -d' ' -f1,3,4,7 |  grep 'connect?session' > ezproxy.out

(This basically retrieves fields 1, 3, 4, and 7 from each line of the log that shows the user authenticating their session.)

With the user names redacted, the output looks like:



Step Three - run it through my Python script

Details in next post

Friday 8 November 2019

Kiosks - news flashes and advertising posters

I had coded the kiosk interface as a dynamic webpage: using Bootstrap to handle the formatting, and PHP to gather up the data from outside systems (the PC availability API; the study rooms; the occupancy total).

While doing so, I hadn't put any thought into the possibility that anyone apart from me would ever want to make changes to the content.  As such, the backend setup was very programmatic, and editable only on a web server to which few of our staff have access.

I had basically created a setup that would leave me hostage to making each and every change!

That would have been fine if we were talking about interesting developments to the floorplans, or additional API calls: but, very quickly, colleagues were asking me to post latest-news style articles at short notice.

To allow me to retain control of the code, while letting colleagues update the page, I came up with the simple trick of a web form that allowed colleagues to enter a headline and the text of a news flash.  Whatever they typed into the form was saved to a MySQL database, from which it was inserted live into the kiosk main page - with all the appropriate formatting applied.

This proved such a quick win (in sparing me from spade work) that I adapted the web-form -> MySQL method to allow colleagues to add a digital poster to the display.  This permitted our Admin team to advertise topical events and news without needing me, while leaving the code of the page safe from accidental change.

In the picture, you can see a poster advertising some events from our Digital Discovery week.  The news flash has been temporarily removed to make space for a scrolling carousel of book jackets that link through to our catalogue during Black History Month.
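
The live kiosk page does all of this in PHP, but the news-flash lookup described above amounts to very little code.  A rough Python equivalent, with invented table and column names, might look like this:

# Sketch (the real page is PHP): fetch the most recent news flash
# and wrap it in the markup the kiosk page expects.
# Table and column names are invented for the illustration.
import mysql.connector
from html import escape

db = mysql.connector.connect(host="localhost", user="kiosk",
                             password="***", database="kiosk")

def latest_newsflash():
    cur = db.cursor()
    cur.execute("SELECT headline, body FROM newsflash "
                "ORDER BY created_at DESC LIMIT 1")
    row = cur.fetchone()
    if row is None:
        return ""                      # nothing to show
    headline, body = row
    return ("<div class='alert alert-info'><h2>{}</h2><p>{}</p></div>"
            .format(escape(headline), escape(body)))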

Tuesday 5 November 2019

Kiosks - showing free study rooms

The next widget I added used a direct SQL query to our room booking system, and showed whether there were any bookable study rooms free during the current hour slot:
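
The query itself depends entirely on the booking system's schema, so the sketch below uses invented table and column names; it simply counts bookable rooms with no booking overlapping the current hour slot.

# Sketch: how many bookable study rooms have no booking in the current
# hour slot?  The schema here is hypothetical - the real query runs
# against our room booking system's own tables.
import mysql.connector
from datetime import datetime, timedelta

db = mysql.connector.connect(host="bookings-server", user="readonly",
                             password="***", database="roombookings")

now = datetime.now()
slot_start = now.replace(minute=0, second=0, microsecond=0)
slot_end = slot_start + timedelta(hours=1)

cur = db.cursor()
cur.execute("""
    SELECT COUNT(*) FROM study_rooms r
    WHERE r.bookable = 1
      AND NOT EXISTS (SELECT 1 FROM bookings b
                      WHERE b.room_id = r.room_id
                        AND b.start_time < %s AND b.end_time > %s)
""", (slot_end, slot_start))
free_rooms = cur.fetchone()[0]
print(f"{free_rooms} study rooms free this hour")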


So, within a few weeks, our navigation kiosk had increased its repertoire.

Like a science fiction concierge, it now stood at our entrance; greeting everyone with a summary of how busy the building was at that moment, and offering directions.

Kiosks part 4 - the quick wins continue - PC availability

Seeing the positive response to the occupancy counter, I swiftly added a couple of extra widgets.

The first showed how many PCs were currently free for use:






This called a Web Service created by our central IT Services, which monitored logins across campus in real time.  With a bit of tweaking, it was quick work to restrict this to machines in the Library.
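
I can't share the Web Service itself, but the kiosk's side of the call is nothing more complicated than the sketch below; the URL and JSON field names are invented stand-ins.

# Sketch: call the (hypothetical) PC-availability Web Service and count
# free machines in the Library.  The URL and JSON field names are invented;
# the real service is provided by our central IT Services.
import requests

resp = requests.get("https://example.sussex.ac.uk/pc-availability/api",
                    params={"building": "Library"}, timeout=5)
machines = resp.json()   # assumed: a list of {"name": ..., "in_use": true/false}

free = sum(1 for pc in machines if not pc.get("in_use"))
print(f"{free} of {len(machines)} Library PCs free")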

Because of the kiosk's touch-screen, users could click on an area with free PCs, and see the location on the floorplan (complete with pulsing marker pin).

Kiosks: part 3

With the monolithic kiosk in place, it was immediately obvious that its potential would be sorely under-exploited if we left it dedicated to displaying our floorplans.

Once term got up and running, we imagined that users would become more familiar with the building layout, and have less use for the navigational aspects of the kiosk.

I had already been doing some preliminary exploration of creating an at-a-glance webpage that would give users a feeling for how busy the Library building was in real time.  My thought was to create a dashboard showing the current availability of services:

  • How full was the building?
  • How many PCs were free?
  • Whether there were any available study rooms
  • How many laptops were available for borrowing?

Access to our library involves climbing several dozen steps, so I was planning a web page that would give people a sense of whether it was worth coming over and tackling an Aztec temple's worth of stairs.

This sort of real-time information struck me as potentially useful on the kiosk.  True - I wasn't going to save anyone the steps - but I could make it immediately clear when they walked through the front door how busy they were likely to find our services.

I started to add my experimental features to the monolith.

First of all, I added a prominent bar that showed how many people were in the building.  This was something that had been available to staff for years, and relied on a simple call to our entry management system.
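
That call is the simplest of the lot; the only real logic is turning the headcount into a bar, roughly as below (the endpoint and capacity figure are invented for the example).

# Sketch: fetch the current headcount from a (hypothetical) entry-management
# endpoint and turn it into a percentage for the occupancy bar.
import requests

CAPACITY = 2000   # invented figure

count = int(requests.get("https://example.sussex.ac.uk/gates/occupancy",
                         timeout=5).text)   # assumed: returns a plain headcount
percent = min(100, round(100 * count / CAPACITY))
print(f"{count} people in the building ({percent}% full)")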


A quick win for me, as it reused existing code: this feature was welcomed straight away.

Friday 25 October 2019

Kiosks: Part 2

Time was pressing to get the floorplans launched, so to begin with I had no time to develop any groovy software for the kiosk.

For the beginning of term, I just kept things simple, and pointed the kiosk to the same floorplan interface as everything else.  I had been testing the web version all summer on different devices, so was pretty confident it would scale up or down to any screen size.


More to my surprise than anyone else's (my colleagues have a uniformly better grasp of human psychology than me), the kiosk proved an immediate hit.

Within days of powering it up, colleagues were making a feature of it during Library Welcome tours.

From Day One, I watched staff on our Information Hub walking users over to the kiosk to direct them around the building in response to directional queries.

And, all day long, I saw a steady stream of users tapping away at the screen.

It looked like I had a success on my hands.



When I had put forward the purchase, I told colleagues that such a kiosk would allow us to explore a whole new world of features for digital signage and information sharing.

As soon as the rush of a new term had come under control, I knew I was going to have to make good on my promises, and find some quick-win uses for our new equipment ...

Navigation kiosks - Part 1


While I was finalising the launch version of our floorplans in 2018, I floated the idea of buying a touch screen kiosk to be located near the main entrance.

I thought it could greet people when they arrived, and give them an immediate way to find what they needed in the Library.  I had long admired the interactive signage in shopping malls that allows you to swiftly see where you could get fast caffeine, or a route to the nearest Apple Store avoiding teenage clothing shops.

I have to admit, I made the suggestion with very little expectation that the budget would be approved.  My primary intention was to put floorplans into the pocket of every user with a smartphone (assuming that would be every last one of them); the kiosk itself seemed a bit of a gimmick, but I thought it might help raise awareness of the new floorplan service.

To my surprise, the spend was almost immediately authorised, and just before the start of autumn term 2018, I found us with what was basically a six-foot-tall Android tablet:

Thursday 24 October 2019

Reassuring statistics

While preparing a report, I came up with a couple of very positive statistics on the take-up of our digital floorplans:

  • Comparing the period September-October 2017 (before the floorplans were developed) with the same period in 2019 (a year after their launch), there was a reduction in directional enquiries at the Information Hub of 55%.

  • Comparing the first weeks of term in 2018 and 2019, the floorplans experienced an increase in page hits of 268%.

I know that stats don't prove causality, but these ones at least make me feel that the floorplans must have had a positive impact on users and staffed services.

Wednesday 16 October 2019

Ezproxy log harvesting: Padlet notes from my talk at EPUG


I outlined my model-in-progress at EPUG on 15/10/2019.

The Padlet on which I (very loosely) structured what I said is at:

https://padlet.com/alfi1/mtgzij7r0tqq

Electronic resources: getting informative usage statistics from our EZProxy logs using Python and Microsoft Power BI


Outline of intended model: September 2019

I started developing a process whereby relevant information could be extracted from the Ezproxy logs, augmented with information from other sources (a user’s school from Juno, and geographic location via IP address), and loaded into a MySQL database.
This enhanced data was then available for data visualisation in tools like Microsoft Power BI.
This was achieved by means of a Python script.

The model works as follows:
  • A UNIX shell script extracts the lines from the ezproxy log that identify a user authentication to a resource: gathering the user name, resource, IP address and timestamp.  [At the moment, I am doing this manually, but it can be automated for a production model.]
  • A Python script works through the log extract. For each authentication, it calls the Alma API to get the user’s group, School, and course of study.
  • Python replaces the IP address with a geographical location.
  • Python writes the combined data to a MySQL database.
  • Visualisations can then easily be created from the MySQL database, using software of choice (Power BI in our case).
In terms of GDPR, the MySQL database will not hold anything that identifies an individual.
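
For completeness, the table the script writes into is shaped roughly like this (the column names are illustrative); note that, deliberately, there is no user-name column.

# Illustrative only: the shape of the MySQL table the Python script fills.
# Column names are examples; deliberately, there is no user-name column.
import mysql.connector

db = mysql.connector.connect(host="localhost", user="ezproxy",
                             password="***", database="eresources")
cur = db.cursor()
cur.execute("""
    CREATE TABLE IF NOT EXISTS ezproxy_usage (
        id        INT AUTO_INCREMENT PRIMARY KEY,
        school    VARCHAR(100),
        timestamp VARCHAR(40),
        resource  VARCHAR(255),
        country   VARCHAR(100),
        usergroup VARCHAR(100),
        course    VARCHAR(100)
    )
""")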

Tuesday 25 June 2019

Tutorial 2 (super-fast): using Bootstrap to make the images responsive to screen size

In order to make sure that my floorplan images are responsive to screen size, I used the Bootstrap framework.

You just add a few lines to the header of your HTML file, and then you can tap into a huge range of design shortcuts all masterminded by someone else.

For instance, to make my floorplan images resize for any screen, I just had to call the image as 'fluid':

<object type="image/svg+xml" class="img-fluid" height="80%" preserveAspectRatio="xMidYMid meet" data="http://users.sussex.ac.uk/~alfi1/tutorial/sample-floorplan.svg"></object>
Try resizing my sample image here.

Using the cleverness of Bootstrap, I saved myself a vast amount of time in all aspects of the design and prototyping, and the end product looks far more professional than anything I could have created on my own.

Tutorial 1: easy explanation of the trick at the heart of the floorplans

My whole floorplan model is based on one simple trick in SVG.

My floorplans all rely on the fact that, in an SVG (Scalable Vector Graphics) file, you can change the image in real time based on simple programming logic.  Importantly, you can show and hide individual elements of the drawing.

I have put a simple example of a floorplan here to explain how this works.

I created my SVG file using Photoshop, and then opened it in a text editor (Notepad++ in my case).

All the instructions needed to draw the image onto the screen are stored inside the file, between the tags <svg> and </svg>

So, I start adding instructions of my own.

First, I add a marker pin to indicate 'Collection A':

<g id="myPin" transform="rotate(330)">
<image x="55" y="120" width="150" height="150" xlink:href="pulsing-pin.svg" />
</g>

Then, I immediately use javascript to hide the pin:

document.getElementById("myPin").style.visibility = "hidden";

Then, also in javascript, I look to see if there are any parameters appended to the URL.

This example works on the parameter 'show=a':

http://users.sussex.ac.uk/~alfi1/tutorial/sample-floorplan.svg?show=a

If the parameter is 'a', I simply unhide the marker pin inside the SVG:

var split_it_up = window.location.search.split('=');
var showMe = split_it_up[1];

// Collection A: unhide the pin if required
if (showMe == 'a') {
    document.getElementById("myPin").style.visibility = "visible";
}

So, in this example, this form of the URL will not show a marker pin:

http://users.sussex.ac.uk/~alfi1/tutorial/sample-floorplan.svg

but this one will:

http://users.sussex.ac.uk/~alfi1/tutorial/sample-floorplan.svg?show=a

Here is a quick screencast walkthrough of the code for this tutorial.

The code for sample_floorplan.svg can be found on my Github.


Thursday 20 June 2019

Github - the core code

I have put the principal files onto Github:

https://github.com/alfi1/floorplans

These files contain the core functionality behind the system.

Because the code is written for all the University of Sussex classmarks, it will be overly long for any of you wanting to adapt it for your own institutions. It will also be full of repetition, and Sussex-specific bits.

Do not be put off!

I will add some tutorials on how the model works, and include some simplified examples to explain the basic process.

Basically, it is a straightforward model that I should be able to explain succinctly with example files.

Various prototypes: the cutting room floor!

In this version, I wanted to see what the classmarks looked like as solid blocks of colour:

http://wwwnewdev.sussex.ac.uk/library/tim/maps-mobile/librarymap_with_array_classmarkblocks.php?location=b_classmark

Here is a version in which I indicate the shelving bay (a la Stackmap):

http://wwwnewdev.sussex.ac.uk/library/tim/maps-mobile/librarymap_with_array.php?location=d_classmark

A student forum immediately dismissed this idea as overkill.  They told me they just needed guiding to the general classmark area, and they could use their own intelligence from that point to get to the book.

For a while, I wondered whether I could use Google Maps for the floorplan images.  Below is a mock-up of how that might have looked:

http://wwwnewdev.sussex.ac.uk/library/tim/maps-mobile/librarymap.php?location=p_classmark

I had hopes that I could use GPS to deliver the whole floorplan concept: with users seeing their current location via their phone's position.  Background reading and experimentation on the current state of geolocation inside buildings persuaded me this was an aspiration to postpone for the still-distant future.



Tuesday 18 June 2019

My PowerPoint from Holloway

Speaker's notes and all.

In the interests of transparency, you are welcome behind the curtain:

https://sussex.box.com/s/2fx47bcy4wa8h1d5au9qge7lfoy1wpcz

Where to find the floorplans?

Couldn't be easier:

http://www.sussex.ac.uk/library/floorplans/

Why a blog?

In the first instance, I will use this blog as a way to share my code for our digital floorplans.

At the UX In Libraries conference in Royal Holloway, there was interest from other sites in adapting this model for their own use.

This will give me a place to record what I did, and advise other people on how to carry it forwards for themselves.

A run through of my presentation from UXLibs 2019

A dry run of my presentation from Holloway, on the design process of our digital floorplans.

It lacks the improvised live quality, but is the closest thing to a repeat you will get:

https://drive.google.com/file/d/1p0jT0g_jQ37ssJGjF64VLhDvNRkWJlSe/view