Monday 18 November 2019

The Real World comes a-calling: five weeks of coding fun!

As a techie, I have to admit that when we put our new kiosks into place on the first and second floors, they looked rather excellent.

However, as often happens to techies, it didn't take long for the real world to point out what should have been glaringly obvious to me all summer.

As soon as people started to look at these fine kiosks, staff began commenting that everyone had to incline their head to one side to work out where they were.

I had even been doing the same myself, without realising it.

The explanation was simple.  The floorplan graphics had all been drawn with our main entrance at the bottom of the picture.  This meant that, when viewed at a kiosk location on an upper floor, the map was orientated at 90 degrees to the standpoint of the user.

So, a subject that was shelved straight in front of someone appeared as if it should be off to their right.

My subconscious mind must have been suppressing this blooper for at least a year: it had been the first thing my daughter mentioned when I showed her the floorplans (don't you wish you had an interesting Dad like me?).  This time, I couldn't take refuge behind parental knowing-it-all.

I was going to have to rotate the floorplans.

Sounds easy, doesn't it?  That is certainly what everyone who suggested it to me thought.  'Why don't you just ...'

However, to pull it off, I had to go back to the beginning with the first and second floor image files and redraw new versions of the graphics.  This was down to my earlier innocence when converting the floorplans into vector graphics, which had left me with no way to rotate a floor without all the text turning onto its side.

That set me off on five weeks of recoding two floors to reflect the real world.

I finished the job on Friday afternoon (it is now Monday morning), and I still can hardly believe I am free of it.  The end results are what I should have aimed for from the beginning.  Now that I have mapped all the text and building landmarks as proper mathematical co-ordinates, I have plans that I can edit quickly and easily.

The end results are worth every second I spent on them.  I just wish I had known eighteen months ago how to avoid needing to do them at all.

Friday 15 November 2019

Ezproxy - walkthrough final step - cool graphs!

Step Four: analyse the data and make whizzy graphs

All that remains is to point your reporting tool of choice at the MySQL database.

We have been using Microsoft Power BI to quickly produce visualisations that depict usage of our electronic resources in a way that has never been possible before.
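For anyone without Power BI to hand, the same sort of aggregate can be pulled straight from the database.  Here is a minimal sketch in Python, using an in-memory SQLite database as a stand-in for our MySQL one - the table name, columns, and sample rows are illustrative assumptions based on the fields the harvesting script stores:

```python
import sqlite3

# SQLite stand-in for the MySQL database (table name and columns assumed).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ezproxy_usage
                (school TEXT, timestamp TEXT, resource TEXT,
                 country TEXT, usergroup TEXT, course TEXT)""")
conn.executemany(
    "INSERT INTO ezproxy_usage VALUES (?, ?, ?, ?, ?, ?)",
    [("Engineering", "2019-11-04 09:15", "sciencedirect", "United Kingdom", "UG", "BEng"),
     ("Engineering", "2019-11-04 10:02", "jstor", "United Kingdom", "PG", "MSc"),
     ("Law", "2019-11-05 11:30", "westlaw", "France", "UG", "LLB")])

# The kind of aggregate a reporting tool builds behind its charts:
# hits per school per resource.
usage = conn.execute("""SELECT school, resource, COUNT(*) AS hits
                        FROM ezproxy_usage
                        GROUP BY school, resource""").fetchall()
```

A reporting tool is really just issuing queries like that last one and drawing the results.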



Ezproxy walkthrough Part Two

Step Three - Python script

I have put a slightly redacted version of the Python script at:
https://github.com/alfi1/ezproxy-harvest
The script goes through the output file I created previously from our log files, and for each line:

  • Makes an Alma API call that gets information about the user from our Library Management system
  • Extracts 'school', 'department', and 'course' from the API results
  • Converts the IP address of the requester to a geographic location (country name)
  • Writes out to a MySQL database:
      • school
      • timestamp
      • resource
      • country
      • usergroup
      • course
From a GDPR point of view, note that the user name is discarded and not sent to the database.  We only want a school/department/course affiliation, and nothing that could identify the individual.
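To make that flow concrete, here is a stripped-down sketch of the per-line handling.  The field order and the shape of the lookup results are my assumptions, and the real Alma API and IP-geolocation calls are left out:

```python
# Sketch of the per-line handling.  The field order (ip, user, timestamp,
# resource) and the shape of the lookup results are assumptions; the real
# Alma API and IP-geolocation calls are omitted.
def parse_line(line):
    """Split one line of ezproxy.out into its four fields."""
    ip, user, timestamp, resource = line.split()
    return {"ip": ip, "user": user, "timestamp": timestamp, "resource": resource}

def build_record(fields, user_details, country):
    """Assemble the row for MySQL.  The user name is deliberately dropped."""
    return {
        "school": user_details["school"],
        "timestamp": fields["timestamp"],
        "resource": fields["resource"],
        "country": country,
        "usergroup": user_details["usergroup"],
        "course": user_details["course"],
    }
```

The point to notice is that nothing identifying the user survives into the record that reaches the database.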

Ezproxy harvesting - walk through of the steps so far

I am genuinely happy at the prospect of being able to analyse usage statistics for our electronic resources.  I have heard myself telling colleagues it was impossible for so many years that I feel ashamed that I never seriously tried to pull it off before.

My Python -> MySQL model is shaping up well.

Here is an outline of how the process works so far.  (This will all be automated at a later stage, but at the moment involves me taking the place of scheduled jobs).

Step One: get the Ezproxy logs

We host our own Ezproxy server, so I just FTP the most recent batch to a network drive that allows me to run Python.

The log files I need are named along the lines of:
  • ezproxy.log.04Nov2019
  • ezproxy.log.05Nov2019
  • ezproxy.log.06Nov2019

Step Two: extract the details I need

From these huge logfiles, I only need a tiny subset of information:
  • IP address of the requester
  • User name of the requester
  • Timestamp
  • Which of our electronic resources they viewed
I do this at the command line, by going through the logs and cutting out what I need:

cat ezproxy*.log* | cut -d' ' -f1,3,4,7 | grep 'connect?session' > ezproxy.out

(This basically retrieves columns 1, 3, 4, and 7 from the log file, from each line that shows the user authenticating their session)
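For when I come to automate this properly, the same extraction is easy to express in Python.  A rough equivalent of the one-liner, assuming the same space-separated columns:

```python
def extract_fields(log_lines):
    """Rough Python equivalent of the cut/grep one-liner: from each line
    containing 'connect?session', keep space-separated columns 1, 3, 4 and 7."""
    kept = []
    for line in log_lines:
        if "connect?session" not in line:
            continue
        cols = line.split(" ")
        if len(cols) < 7:
            # Malformed line: skip rather than crash.
            continue
        kept.append(" ".join(cols[i] for i in (0, 2, 3, 6)))
    return kept
```

Like cut with `-d' '`, splitting on a single space keeps empty columns, so the numbering matches the shell version.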

With the user names redacted, the output looks like:



Step Three - run it through my Python script

Details in the next post.

Friday 8 November 2019

Kiosks - news flashes and advertising posters

I had coded the kiosk interface as a dynamic webpage: using Bootstrap to handle the formatting, and PHP to gather up the data from outside systems (the API of PC availability; the study rooms; the occupancy total).

While doing so, I hadn't put any thought into the possibility that anyone apart from me would ever want to make changes to the content.  As such, the backend setup was very programmatic, and editable only on a web server to which few of our staff have access.

I had basically created a setup that would leave me hostage to making each and every change!

That would have been fine if we were talking about interesting developments to the floorplans, or additional API calls: but, very quickly, colleagues were asking me to post latest-news style articles at short notice.

To allow me to retain control of the code, while letting colleagues update the page, I came up with a simple trick: a web form that allowed colleagues to enter a headline and the text of a news flash.  Whatever they typed into the form was saved to a MySQL database, from which it was inserted live into the kiosk main page - with all the appropriate formatting applied.
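The actual page is PHP, but the round trip is simple enough to sketch in a few lines of Python, with SQLite standing in for MySQL and the table layout assumed:

```python
import html
import sqlite3

# SQLite stand-in for the MySQL table behind the staff web form
# (table and column names are assumptions).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE newsflash (headline TEXT, body TEXT)")

def save_newsflash(headline, body):
    """What the staff-facing web form does with its two inputs."""
    conn.execute("INSERT INTO newsflash VALUES (?, ?)", (headline, body))

def render_newsflash():
    """What the kiosk page does: fetch the latest row and wrap it in markup."""
    row = conn.execute(
        "SELECT headline, body FROM newsflash ORDER BY rowid DESC LIMIT 1"
    ).fetchone()
    if row is None:
        return ""
    headline, body = (html.escape(v) for v in row)
    return f'<div class="alert alert-info"><h4>{headline}</h4><p>{body}</p></div>'
```

Escaping whatever colleagues type before it hits the page is what keeps the code itself safe from accidental (or creative) breakage.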

This proved such a quick win (in sparing me from spade work) that I adapted the web-form->MySQL method to allow colleagues to add a digital poster to the display.  This permitted our Admin team to advertise topical events and news, without needing me, and leaving the code of the page safe from accidental change.

In the picture, you can see a poster advertising some events from our Digital Discovery week.  The news flash has been temporarily removed to make space for a scrolling carousel of book jackets that link through to our catalogue during Black History Month.

Tuesday 5 November 2019

Kiosks - showing free study rooms

The next widget I added used a direct SQL query to our room booking system, and showed whether there were any bookable study rooms free during the current hour slot:
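The real query runs against the booking system's own schema, which isn't mine to publish.  As an illustration only (all names and the slot format assumed), the logic amounts to something like this:

```python
import sqlite3

# SQLite stand-in for the room booking system (schema assumed).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (room TEXT, slot_start TEXT)")
conn.executemany("INSERT INTO bookings VALUES (?, ?)",
                 [("Room A", "2019-11-05 10:00"),
                  ("Room B", "2019-11-05 11:00")])

ALL_ROOMS = ["Room A", "Room B", "Room C"]

def free_rooms(slot_start):
    """Rooms with no booking starting in the given hour slot."""
    booked = {r for (r,) in conn.execute(
        "SELECT room FROM bookings WHERE slot_start = ?", (slot_start,))}
    return [room for room in ALL_ROOMS if room not in booked]
```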


So, within a few weeks, our navigation kiosk had increased its repertoire.

Like a science fiction concierge, it now stood at our entrance, greeting everyone with a summary of how busy the building was at that moment, and offering directions.

Kiosks part 4 - the quick wins continue - PC availability

Seeing the positive response to the occupancy counter, I swiftly added a couple of extra widgets.

The first showed how many PCs were currently free for use:






This called a Web Service created by our central IT Services, which monitored the logins across campus in real time.  With a bit of tweaking, I was quickly able to restrict this to machines in the Library.
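The Web Service's response format isn't mine to document, but the Library-only tweak boils down to a filter along these lines (the field names are assumptions):

```python
# Trim the campus-wide feed down to free machines in the Library.
# The 'building' and 'in_use' field names are assumptions about the feed.
def library_free_pcs(machines):
    """Count machines in the Library with nobody logged in."""
    return sum(1 for m in machines
               if m["building"] == "Library" and not m["in_use"])
```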

Because of the kiosk's touch-screen, users could click on an area with free PCs, and see the location on the floorplan (complete with pulsing marker pin).

Kiosks: part 3

With the monolithic kiosk in place, it was immediately obvious that its potential would be sorely under-exploited if we left it dedicated to displaying our floorplans.

Once term got up and running, we imagined that users would become more familiar with the building layout, and have less use for the navigational aspects of the kiosk.

I had already been doing some preliminary exploration of creating an at-a-glance webpage that would give users a feeling for how busy the Library building was in real time.  My thought was to create a dashboard showing the current availability of services:

  • How full was the building?
  • How many PCs were free?
  • Were there any available study rooms?
  • How many laptops were available for borrowing?

Access to our library involves climbing up several dozen steps: so I was planning a web page that would give people a sense of whether it was worth coming over and climbing up an Aztec temple's worth of stairs.

This sort of real-time information struck me as potentially useful on the kiosk.  True - I wasn't going to save anyone the steps - but I could make it immediately clear when they walked through the front door how busy they were likely to find our services.

I started to add my experimental features to the monolith.

First of all, I added a prominent bar that showed how many people were in the building.  This was something that had been available to staff for years, and relied on a simple call to our entry management system.
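The widget itself is little more than arithmetic on the headcount the entry system returns.  A sketch, where the capacity figure and the wording are made up for illustration:

```python
# Turn the raw headcount from the entry management system into the
# text shown on the kiosk's occupancy bar (capacity figure is illustrative).
def occupancy_summary(people_in, capacity):
    """Express the headcount as a human-readable fullness figure."""
    pct = round(100 * people_in / capacity)
    return f"{people_in} people in the building ({pct}% full)"
```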


A quick win for me, as it reused existing code - and the feature was welcomed straight away.