Overview

Description

Delete bibliographic, instance, and item data from OLE as a batch process, using an imported file (e.g., a MARC file from a vendor or a text file of record IDs) to identify the records.

 

Files of records or record IDs often come from vendors but can also be created in the library.  A vendor may decide to remove access to one or more titles that have been accessible through a subscribed collection, or a library might decide to cancel a subscription to an aggregate of titles.  Since the library's access to such titles has been terminated, the references to them need to be removed from the library database.

 

Vendors may supply a binary MARC file that includes records for the titles that are no longer accessible, or a text file that simply lists record IDs may be used.  Ideally, a library can use such a file to search its local database for records that match those in the file and then delete the matching records.
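As an illustration of this file-driven approach, the sketch below (plain Java written for this document, not OLE code) reads a text file containing one record ID per line and hands each ID to a delete operation. The BibDeleteService interface and its deleteByBibId method are assumptions made for the example only; a real batch process would call the OLE document store and index instead.

// Minimal sketch (assumption, not OLE code): read a vendor-supplied text file of
// record IDs, one ID per line, and request deletion of each matching record.
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class RecordIdFileDelete {

    /** Hypothetical callback standing in for the actual bib/instance/item delete. */
    public interface BibDeleteService {
        void deleteByBibId(String bibId);
    }

    /** Reads one record ID per line, skips blank lines, and requests deletion of each. */
    public static void deleteFromIdFile(Path idFile, BibDeleteService deleteService) throws IOException {
        List<String> lines = Files.readAllLines(idFile, StandardCharsets.UTF_8);
        for (String line : lines) {
            String bibId = line.trim();
            if (!bibId.isEmpty()) {
                deleteService.deleteByBibId(bibId);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Stub implementation that only logs; the real process would update the
        // document store and the Solr index.
        deleteFromIdFile(Path.of("record-ids.txt"),
                bibId -> System.out.println("Would delete bib record " + bibId));
    }
}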

Design: Business

User Stories

Priority Rank Order

 


 

Design: Technical
Service Design and Implementation
Screen Flow Diagram

Lay out use cases based on the screen flows, indicating navigation between the different screens and pop-ups.

Security
  • Roles – list of roles to be defined
  • Permissions – list of permissions to be defined
  • User Profiles – user profiles to which roles/permissions may be applied (also for test/mock purposes)
Rules
  • Context & Agendas
  • Terms & Propositions
  • Sample XML file
Component Design
  • OJB.xml
  • Class Diagram

 

  • State Chart Diagram
  • Pseudo Code
  • User creates a batch profile using the options provided in the batch profile screen
  • User schedules a batch job for a given date and time
  • BatchProcessJobManager performs the job scheduling by calling the scheduleJob(ScheduleJobBo bo) method. The schedule information (stored as a cron expression) and the job information are stored in the database
  • The Quartz scheduler, which is configured in KRMSLocalSpringBean.xml, calls the BatchProcessManager.execute(int processId) method.
  • BatchProcessManager loads the profile (BatchProcessProfile) created by the user based on the supplied process id and calls the factory class to create a BatchProcess instance.
  • The BatchProcessFactory creates the appropriate BatchProcess instance based on the processType in the profile
  • processBatch(BatchProcessProfile profile) is called to perform the batch process as per the profile information provided by the user (a simplified sketch of this dispatch flow appears after the steps below).
    1. BatchProcessExportData
      This batch process performs the export of bib / instance information based on the profile provided by the user. It has the following methods:
      1. batchExport(BatchProcessProfile profile) 

      This method is passed the BatchProcessProfile and writes its output to the filesystem. It is the method used for scheduled exports. It should function as follows:

     

    1. A call is made to getBatchJobInfo(int processId) to get the current job info for the job that executed processBatch()
    2. updateBatchProgress(BatchProcessJobBo jobBo) is called to update the job running status to started and to record the start time.
    3. A call is made to the batchExport(BatchProcessProfile profile) method passing the profile information.
    4. The needed values are pulled out of the profile information.
    5. exportFilterCriteria - Which exportFilterCriteria to use; the exportFormat is read from the exportFilterCriteria
      1. exportType - Full or Incremental
      2. exportId (processId)- Used for saving the incremental dateTime.
      3. exportFileTo - The directory and file name format.
      4. writeReportTo - The directory and report file name.
      5. chunkSize - Max number of records per file.
      6. staffOnly- If staffOnly records should be included.
      7. updateLeveInfo - bibOnly, instance.
    6. If the export is incremental 
      1. Get the last export dateTime for the exportId
      2. Save the current dateTime / exportId pair.
    7. Call getSolrDocList(BatchProcessProfile profile) to get the Solr doc list
    8. updateBatchProgress(BatchProcessJobBo jobBo) is called to update the total number of records to be processed.
    9. If there is more than one chunk, split the Solr results into chunks based on the chunk size (see the chunking sketch after these steps).
    10. Make a call to ExportService.getExportDataBySolr using the exportFormatProfile and one set of Solr results split as per the chunk size.
    11. If isMarc21 is true, convert the returned MarcXML to Marc21.
    12. Write the returned XML or Marc21 from the export service to files using the exportTo variable.
    13. Call updateBatchProgress(BatchProcessJobBo jobBo), passing the jobBo, which contains the current job information updated with records processed, % processed, and time spent.
    14. If there are more chunks, make the next call to the ExportService.
    15. Create the Report in the location provided in the profile.
    16. updateBatchProgress(BatchProcessJobBo jobBo) is called to update the job running status to completed and to record the end time, total records processed, % processed, and time spent.
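
A simplified sketch of the scheduling/dispatch flow described in the bullet list above is given here. The class and method names (BatchProcessManager, BatchProcessFactory, BatchProcessProfile, BatchProcess.processBatch) follow the pseudo code, but the fields, the processType values, and the method bodies are illustrative assumptions rather than the actual OLE implementation; profile loading and the Quartz trigger wiring are stubbed out.

// Illustrative sketch only: names follow the pseudo code above, bodies are assumptions.

/** Profile created by the user on the batch profile screen. */
class BatchProcessProfile {
    private final int processId;
    private final String processType; // e.g. "Export" (value is an assumption)

    BatchProcessProfile(int processId, String processType) {
        this.processId = processId;
        this.processType = processType;
    }

    int getProcessId()      { return processId; }
    String getProcessType() { return processType; }
}

/** Common contract implemented by every batch process. */
interface BatchProcess {
    void processBatch(BatchProcessProfile profile);
}

/** Creates the appropriate BatchProcess instance based on the processType in the profile. */
class BatchProcessFactory {
    BatchProcess createBatchProcess(BatchProcessProfile profile) {
        switch (profile.getProcessType()) {
            case "Export":
                return new BatchProcessExportData();
            default:
                throw new IllegalArgumentException("Unknown process type: " + profile.getProcessType());
        }
    }
}

/** Entry point invoked by the Quartz trigger when the scheduled cron expression fires. */
class BatchProcessManager {
    private final BatchProcessFactory factory = new BatchProcessFactory();

    void execute(int processId) {
        BatchProcessProfile profile = loadProfile(processId);       // load the persisted profile
        BatchProcess process = factory.createBatchProcess(profile); // pick the implementation
        process.processBatch(profile);                              // run the batch process
    }

    private BatchProcessProfile loadProfile(int processId) {
        // Placeholder: the real manager reads the BatchProcessProfile from the database.
        return new BatchProcessProfile(processId, "Export");
    }
}

/** Export implementation; its chunked batchExport loop is sketched separately below. */
class BatchProcessExportData implements BatchProcess {
    @Override
    public void processBatch(BatchProcessProfile profile) {
        System.out.println("Running export for profile " + profile.getProcessId());
    }
}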
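
The per-chunk export loop in steps 9 through 14 can be sketched as follows. The three callback interfaces are placeholders for ExportService.getExportDataBySolr, the write to the exportFileTo location, and updateBatchProgress; they are assumptions made for this example, not OLE APIs, and the MarcXML-to-Marc21 conversion of step 11 is omitted.

// Illustrative chunking loop (assumption): split the Solr results into chunks of at most
// chunkSize records, export each chunk, write one file per chunk, and report progress.
import java.util.ArrayList;
import java.util.List;

public class ChunkedExportSketch {

    interface ExportCallback {      // stands in for ExportService.getExportDataBySolr
        String export(List<String> solrDocIds);
    }

    interface FileWriterCallback {  // stands in for writing to the exportFileTo location
        void write(int chunkNumber, String exportedData);
    }

    interface ProgressCallback {    // stands in for updateBatchProgress(jobBo)
        void report(int recordsProcessed, int totalRecords);
    }

    static void exportInChunks(List<String> solrDocIds, int chunkSize,
                               ExportCallback exporter,
                               FileWriterCallback writer,
                               ProgressCallback progress) {
        int total = solrDocIds.size();
        int processed = 0;
        int chunkNumber = 0;
        for (int start = 0; start < total; start += chunkSize) {
            int end = Math.min(start + chunkSize, total);
            List<String> chunk = new ArrayList<>(solrDocIds.subList(start, end));
            String exported = exporter.export(chunk); // one ExportService call per chunk
            writer.write(++chunkNumber, exported);    // one output file per chunk
            processed += chunk.size();
            progress.report(processed, total);        // records processed vs. total
        }
    }

    public static void main(String[] args) {
        List<String> ids = List.of("bib-1", "bib-2", "bib-3", "bib-4", "bib-5");
        exportInChunks(ids, 2,
                chunk -> "<collection>" + String.join(",", chunk) + "</collection>",
                (n, data) -> System.out.println("file " + n + ": " + data),
                (done, total) -> System.out.println(done + " of " + total + " processed"));
    }
}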

     

     

    Service Contracts
    Service Implementation
    Process Diagram
     

    Class Diagram:

     

     


     UI Screen

     

    Data Model

    ER Diagram: Batch Process Profile

     

    Batch Process Data Model

    Entity : BATCH DEFINITION

     

    Name | Max Len | Data Type | Default Value | Input Type | Action Event | Action Event Function Name | Validation Type | Tab Index | Remarks