Chapter 24. Migrating Patron Data

Table of Contents

Creating an SQL Script for Importing Patrons
Batch Updating Patron Data

This section explains how to migrate patron data from comma-delimited files into Evergreen. It does not cover exporting data from the non-Evergreen system, since that process varies depending on where you are extracting your patron records: the data could come from another ILS or, in the case of academic records, from a student database.

When importing records into Evergreen you will need to populate three tables in your Evergreen database:

  * actor.usr - the main table for patron data
  * actor.card - patron barcodes (library cards)
  * actor.usr_address - patron addresses

Before following the procedures below to import patron data into Evergreen, it is a good idea to examine the fields in these tables in order to decide on a strategy for the data to include in your import. It is important to understand the data types and constraints on each field.
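
For example, you can list each table's columns, data types, and NULL constraints with a standard information_schema query (shown here for actor.usr; repeat for actor.card and actor.usr_address), or use psql's \d command:

    SELECT column_name, data_type, is_nullable
        FROM information_schema.columns
        WHERE table_schema = 'actor' AND table_name = 'usr'
        ORDER BY ordinal_position;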

  1. Export the patron data from your existing ILS or from another source into a comma-delimited file. The comma-delimited file used for importing the records should use Unicode (UTF-8) character encoding; if your source system exports in another encoding, PostgreSQL's COPY command can convert the file at load time via its ENCODING option.
  2. Create a staging table. A staging table will allow you to tweak the data before importing. Here is an example SQL statement:

        CREATE TABLE students (
            student_id int, barcode text, last_name text, first_name text, email text,
            address_type text, street1 text, street2 text, city text, province text,
            country text, postal_code text, phone text, profile int DEFAULT 2,
            ident_type int, home_ou int, claims_returned_count int DEFAULT 0,
            usrname text, net_access_level int DEFAULT 2, password text
        );

    Note

    The DEFAULT values allow you to set defaults for your library or to populate required fields in Evergreen if your data includes NULL values.

    The profile field in the above SQL script refers to the user group and should be an integer referencing the id field in permission.grp_tree. Setting this value will affect the permissions for the user. See the values in permission.grp_tree for possibilities.

    ident_type is the identification type used for identifying users. This is an integer value referencing config.identification_type and should match the id values of that table. The default values are 1 for Drivers License, 2 for SSN or 3 for Other.

    home_ou is the home organizational unit for the user. This value needs to match the corresponding id in the actor.org_unit table.
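
    To review the valid values for these three fields, you can query the lookup tables directly; for example:

        SELECT id, name FROM permission.grp_tree ORDER BY id;
        SELECT id, name FROM config.identification_type ORDER BY id;
        SELECT id, shortname, name FROM actor.org_unit ORDER BY id;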

  3. Copy records into the staging table from your comma-delimited file.

        COPY students (student_id, last_name, first_name, email, address_type, street1, street2,
            city, province, country, postal_code, phone)
            FROM '/home/opensrf/patrons.csv'
            WITH CSV HEADER;

    The script will vary depending on the format of your patron load file (patrons.csv). Note that this example does not load barcode, usrname, or password, which later steps rely on; either include those columns in your file and in the COPY column list, or populate them in the staging table afterwards, as sketched below.
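
    If your file lacks columns that later steps rely on, you can derive them in the staging table instead. The sketch below is illustrative only: it reuses the student ID as barcode and username and sets a placeholder password, choices you would replace with your own local policy.

        -- Illustrative only: derive values for columns the CSV did not supply.
        UPDATE students SET barcode = student_id::text WHERE barcode IS NULL;
        UPDATE students SET usrname = barcode WHERE usrname IS NULL;
        UPDATE students SET password = barcode WHERE password IS NULL;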

  4. Some fields may require reformatting to fit Evergreen's field formats. Here is an example of SQL to adjust phone numbers in the staging table to fit the Evergreen field:

        UPDATE students SET phone = replace(replace(replace(rpad(substring(phone from 1 for 9), 10, '-') ||
            substring(phone from 10), '(', ''), ')', ''), ' ', '-');

    Some data “massaging” will be required to fit the formats used in Evergreen.
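
    Other adjustments follow the same pattern; for example, to trim stray whitespace and lowercase email addresses:

        UPDATE students SET email = lower(btrim(email));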

  5. Insert records from the staging table into the actor.usr Evergreen table:

        INSERT INTO actor.usr (
            profile, usrname, email, passwd, ident_type, ident_value, first_given_name,
            family_name, day_phone, home_ou, claims_returned_count, net_access_level)
            SELECT profile, students.usrname, email, password, ident_type, student_id,
            first_name, last_name, phone, home_ou, claims_returned_count, net_access_level
            FROM students;
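
    A quick sanity check at this point is to compare row counts between the staging table and the users just loaded; the two numbers should match:

        SELECT
            (SELECT count(*) FROM students) AS staged,
            (SELECT count(*) FROM actor.usr u JOIN students s ON s.usrname = u.usrname) AS loaded;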
  6. Insert records into actor.card from the staging table, joining actor.usr to find each user's id.

        INSERT INTO actor.card (usr, barcode)
            SELECT actor.usr.id, students.barcode
            FROM students
            INNER JOIN actor.usr
                ON students.usrname = actor.usr.usrname;

    This assumes a one-to-one card-to-patron relationship. If your patron data assigns multiple cards to some patrons, more complex import scripts may be required that look for active or inactive flags; see the sketch below.
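
    Here is a sketch of one such variant. It assumes a hypothetical is_active column in the staging data (with one staging row per card rather than per patron) and uses the active flag on actor.card:

        INSERT INTO actor.card (usr, barcode, active)
            SELECT actor.usr.id, students.barcode, students.is_active
            FROM students
            INNER JOIN actor.usr
                ON students.usrname = actor.usr.usrname;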

  7. Update actor.usr.card field with actor.card.id to associate active card with the user:

        UPDATE actor.usr
            SET card = actor.card.id
            FROM actor.card
            WHERE actor.card.usr = actor.usr.id;
  8. Insert records into actor.usr_address to add address information for users:

        INSERT INTO actor.usr_address (usr, street1, street2, city, state, country, post_code)
            SELECT actor.usr.id, students.street1, students.street2, students.city, students.province,
            students.country, students.postal_code
            FROM students
            INNER JOIN actor.usr ON students.usrname = actor.usr.usrname;
  9. Update actor.usr.address with the address id from the address table:

        UPDATE actor.usr
            SET mailing_address = actor.usr_address.id, billing_address = actor.usr_address.id
            FROM actor.usr_address
            WHERE actor.usr.id = actor.usr_address.usr;

This assumes one address per patron. More complex scenarios may require more sophisticated SQL; a sketch of one possibility follows.
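
For example, if address_type in your staging data distinguishes mailing from billing addresses, you could carry that value into actor.usr_address (which has an address_type column of its own) and then set each actor.usr field separately. The 'MAILING' value below is illustrative:

    UPDATE actor.usr
        SET mailing_address = actor.usr_address.id
        FROM actor.usr_address
        WHERE actor.usr.id = actor.usr_address.usr
            AND actor.usr_address.address_type = 'MAILING';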

Creating an SQL Script for Importing Patrons

The procedure for importing patrons can be automated with the help of an SQL script. Follow these steps to create an import script:

  1. Create a new file and name it import.sql
  2. Edit the file to look similar to this:
    BEGIN;

    -- Create staging table.
    CREATE TABLE students (
        student_id int, barcode text, last_name text, first_name text, email text,
        address_type text, street1 text, street2 text, city text, province text,
        country text, postal_code text, phone text, profile int DEFAULT 2,
        ident_type int, home_ou int, claims_returned_count int DEFAULT 0,
        usrname text, net_access_level int DEFAULT 2, password text
    );

    --Copy records from your import text file
    COPY students (student_id, last_name, first_name, email, address_type, street1, street2, city, province,
        country, postal_code, phone, password)
        FROM '/home/opensrf/patrons.csv' WITH CSV HEADER;


    --Insert records from the staging table into the actor.usr table.
    INSERT INTO actor.usr (
        profile, usrname, email, passwd, ident_type, ident_value, first_given_name, family_name,
        day_phone, home_ou, claims_returned_count, net_access_level)
        SELECT profile, students.usrname, email, password, ident_type, student_id, first_name,
        last_name, phone, home_ou, claims_returned_count, net_access_level FROM students;

    --Insert records into actor.card from the staging table.
    INSERT INTO actor.card (usr, barcode)
        SELECT actor.usr.id, students.barcode
        FROM students
        INNER JOIN actor.usr
            ON students.usrname = actor.usr.usrname;

    --Update actor.usr.card field with actor.card.id to associate active card with the user:
    UPDATE actor.usr
        SET card = actor.card.id
        FROM actor.card
        WHERE actor.card.usr = actor.usr.id;

    --Insert records into actor.usr_address from the staging table.
    INSERT INTO actor.usr_address (usr, street1, street2, city, state, country, post_code)
        SELECT actor.usr.id, students.street1, students.street2, students.city, students.province,
        students.country, students.postal_code
        FROM students
        INNER JOIN actor.usr ON students.usrname = actor.usr.usrname;


    --Update actor.usr mailing and billing addresses with the id from the actor.usr_address table.
    UPDATE actor.usr
        SET mailing_address = actor.usr_address.id, billing_address = actor.usr_address.id
        FROM actor.usr_address
        WHERE actor.usr.id = actor.usr_address.usr;

    COMMIT;

Placing the SQL statements between BEGIN; and COMMIT; creates a transaction block, so that if any statement fails, the entire process is cancelled and the database is rolled back to its original state. Lines beginning with -- are comments that tell you what each SQL statement is doing; they are not processed.
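
Once the script is complete, run it against your Evergreen database from within a psql session. The connection details are an example only; substitute your own database name and account:

    -- Example session: connect with psql -U evergreen evergreen, then:
    \i import.sql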