Friday 25 October 2019

Kiosks: Part 2

Time was pressing to get the floorplans launched, so to begin with I had no time to develop any groovy software for the kiosk.

For the beginning of term, I just kept things simple, and pointed the kiosk to the same floorplan interface as everything else.  I had been testing the web version all summer on different devices, so I was pretty confident it would scale up or down to any screen size.


More to my surprise than anyone else's (my colleagues have a uniformly better grasp of human psychology than I do), the kiosk proved an immediate hit.

Within days of powering it up, colleagues were making a feature of it during Library Welcome tours.

From Day One, I watched staff on our Information Hub walking users over to the kiosk to direct them around the building in response to directional queries.

And, all day long, I saw a steady stream of users tapping away at the screen.

It looked like I had a success on my hands.



When I put forward the purchase, I told colleagues that such a kiosk would allow us to explore a whole new world of features for digital signage and information sharing.

As soon as the rush of the new term had come under control, I knew I was going to have to make good on my promises and find some quick-win uses for our new equipment...

Navigation kiosks - Part 1


While I was finalising the launch version of our floorplans in 2018, I floated the idea of buying a touch screen kiosk to be located near the main entrance.

I thought it could greet people when they arrived, and give them an immediate way to find what they needed in the Library.  I had long admired the interactive signage in shopping malls that allows you to swiftly see where you could get fast caffeine, or a route to the nearest Apple Store avoiding teenage clothing shops.

I have to admit, I made the suggestion with very little expectation that the budget would be approved.  My primary intention was to put floorplans into the pocket of every user with a smartphone (assuming that would be every last one of them).  The kiosk seemed a bit of a gimmick by comparison, but I thought it might help raise awareness of the new floorplan service.

To my surprise, the spend was almost immediately authorised, and just before the start of autumn term 2018 we found ourselves with what was basically a six-foot-tall Android tablet:

Thursday 24 October 2019

Reassuring statistics

While preparing a report, I came up with a couple of very positive statistics on the take-up of our digital floorplans:

·         Comparing the period September-October 2017 (before the floorplans were developed) with the same period in 2019 (a year after their launch), there was a 55% reduction in directional enquiries at the Information Hub.


·         Comparing the first weeks of term in 2018 and 2019, page hits on the floorplans increased by 268%.

I know that stats don't prove causality, but these ones at least make me feel that the floorplans must have had a positive impact on users and staffed services.

Wednesday 16 October 2019

EZproxy log harvesting: Padlet notes from my talk at EPUG


I outlined my model-in-progress at EPUG on 15/10/2019.

The Padlet on which I (very loosely) structured my talk is at:

https://padlet.com/alfi1/mtgzij7r0tqq

Electronic resources: getting informative usage statistics from our EZproxy logs using Python and Microsoft Power BI


Outline of intended model: September 2019

I started developing a process whereby relevant information could be extracted from the EZproxy logs, augmented with information from other sources (a user’s school from Juno, and geographic location via IP address), and loaded into a MySQL database.
This enhanced data is then available for data visualisation in tools like Microsoft Power BI.
The augmentation and loading are handled by a Python script.

The model works as follows:
·         A UNIX shell script extracts the lines from the EZproxy log that identify a user authentication to a resource, gathering the username, resource, IP address and timestamp.  [At the moment, I am doing this manually, but it can be automated for a production model.]
·         A Python script works through the log extract. For each authentication, it calls the Alma API to get the user’s group, School, and course of study.
·         Python replaces the IP address with a geographical location (there is a rough sketch of the parsing and geolocation just after this list).
·         Python writes the combined data to a MySQL database (a second sketch, further down, covers the Alma lookup and the database load).
·         Visualisations can then easily be created from the MySQL database using the software of your choice (Power BI in our case).
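
For anyone interested in the nuts and bolts, here is a very rough Python sketch of the parsing and geolocation steps. It is only illustrative: it assumes the shell extract leaves one combined-style log line per authentication (the exact fields depend on your EZproxy LogFormat directive), and it uses MaxMind's free GeoLite2 City database via the geoip2 package for the IP lookup, so the file names and the regex are assumptions rather than our production code.

# parse_extract.py -- illustrative only; adjust the regex to match your EZproxy LogFormat
import re
import geoip2.database
import geoip2.errors

# Assumed format of one extracted line:
# 10.20.30.40 - jbloggs [15/Oct/2019:09:12:33 +0100] "GET https://some.resource/login HTTP/1.1" 200 512
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?:GET|POST) (?P<resource>\S+) [^"]*"'
)

def parse_line(line):
    """Pull the username, resource, IP address and timestamp out of one log line."""
    match = LINE_RE.match(line)
    return match.groupdict() if match else None

def geolocate(reader, ip_address):
    """Swap an IP address for a coarse geographic location (city and country)."""
    try:
        response = reader.city(ip_address)
        return response.city.name, response.country.name
    except geoip2.errors.AddressNotFoundError:
        return None, None

if __name__ == "__main__":
    # GeoLite2-City.mmdb is MaxMind's free database; both file paths here are hypothetical
    with geoip2.database.Reader("GeoLite2-City.mmdb") as reader, open("ezproxy_extract.log") as extract:
        for line in extract:
            record = parse_line(line)
            if record is None:
                continue
            city, country = geolocate(reader, record["ip"])
            print(record["user"], record["resource"], record["timestamp"], city, country)
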
In terms of GDPR, the MySQL database will not hold anything that identifies an individual.
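
The enrichment and load steps might look something like the sketch below. The Alma Users API endpoint (GET /almaws/v1/users/{id}) is genuine, but exactly where a user's group, School and course of study sit in the response depends on local configuration, so the field mapping here is an assumption; the MySQL table and column names are likewise made up for illustration. In keeping with the GDPR point above, only a one-way hash of the username is written to the database.

# enrich_and_load.py -- illustrative sketch; Alma field mapping and MySQL schema are assumptions
import hashlib
import requests
import mysql.connector

ALMA_API_KEY = "REPLACE_ME"
ALMA_BASE = "https://api-eu.hosted.exlibrisgroup.com/almaws/v1"  # EU gateway; use your region's

def alma_user_details(user_id):
    """Look up a user in Alma and return group, school and course (mapping is site-specific)."""
    response = requests.get(
        f"{ALMA_BASE}/users/{user_id}",
        params={"apikey": ALMA_API_KEY},
        headers={"Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()
    user_group = data.get("user_group", {}).get("value")
    # Here, School and course of study are assumed to be held as statistical categories
    stats = {s.get("category_type", {}).get("value"): s.get("statistic_category", {}).get("value")
             for s in data.get("user_statistic", [])}
    return user_group, stats.get("SCHOOL"), stats.get("COURSE")

def pseudonymise(user_id):
    """Replace the username with a one-way hash so no direct identifier is stored."""
    return hashlib.sha256(user_id.encode("utf-8")).hexdigest()

def load_row(cursor, row):
    """Insert one enriched authentication event into MySQL (hypothetical table and columns)."""
    cursor.execute(
        "INSERT INTO ezproxy_usage "
        "(user_hash, user_group, school, course, resource, city, country, auth_time) "
        "VALUES (%s, %s, %s, %s, %s, %s, %s, %s)",
        row,
    )

if __name__ == "__main__":
    connection = mysql.connector.connect(
        host="localhost", user="ezproxy", password="REPLACE_ME", database="usage_stats"
    )
    cursor = connection.cursor()
    # 'parsed' would come from the parsing/geolocation step sketched earlier
    parsed = {"user": "jbloggs", "resource": "https://some.resource/login",
              "timestamp": "2019-10-15 09:12:33", "city": "Bristol", "country": "United Kingdom"}
    group, school, course = alma_user_details(parsed["user"])
    load_row(cursor, (pseudonymise(parsed["user"]), group, school, course,
                      parsed["resource"], parsed["city"], parsed["country"], parsed["timestamp"]))
    connection.commit()
    cursor.close()
    connection.close()
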