Wednesday, April 23, 2014

Getting data from the BRMS log

brms backup log sql create table

BRMS (Backup, Recovery and Media Services) is used by many IBM i customers to manage the way they save data. Control Groups are created to group libraries, folders and files together. When a BRMS save is initiated it is for the Control Group, rather than the individual libraries, folders, etc.

I was recently tasked with creating a daily report, emailed to all IT managers, on the status of the previous night's backups. After a quick search through the BRMS menus I found the Display Log for BRM command, DSPLOGBRM. I was disappointed to find that it will only output to display or printer, which made it unsuitable for my needs. I needed a file from which to extract the information to generate the report.

After "surfing" through the files in the BRMS library, QUSRBRM, I found the BRMS log file, QA1ALG, which I confirmed with a telephone call to IBM Support. I am not going to list all of the fields in the file, just the ones I needed:

Field Description
LGMDTA Log text
LGDATE Logged date
LGTIME Logged time
LGSEV Message severity
LGUSER Job user
LGJOB Job name
LGJNBR Job number
LGMID Message id
LGCGNAM Control Group

By combining LGUSER, LGJOB, and LGJNBR I could group the records by the job that had run the Control Group.

I could now extract the data for a period from the BRMS log file into a work file. I used the method I described in the post Creating a SQL table "on the fly" to create a work file in QTEMP. Below is a snippet of the SQLRPGLE program I created to do this:

01  dcl-s wkDateTime char(13) ;
02  dcl-s wkString char(500) ;

03  wkDateTime = %char(%date() - %days(1):*cymd0) + '200000' ;

04  wkString = 'CREATE TABLE QTEMP/@WORK AS +
05                   (SELECT LGMID,LGSEV,LGCGNAM,+
06                           LGUSER,LGJOB,LGJNBR,+
07                           CAST(LGMDTA AS CHAR(200)) AS MSGDATA,+
08                           CONCAT(DIGITS(LGDATE),DIGITS(LGTIME)) +
09                             AS DATETIME +
10                      FROM QUSRBRM/QA1ALG +
11                     WHERE CONCAT(DIGITS(LGDATE),DIGITS(LGTIME)) +
12                           >= ''' + wkDateTime + ''') +
13                     WITH DATA'  ;

14  exec sql EXECUTE IMMEDIATE :wkString ;

Lines 1 and 2 show that I am starting to dabble with the new TR7 totally free RPG; we need to come up with a name for this, perhaps RPG5/RPG5? I am not going to go into much detail about what these lines do, except to say that dcl-s is used to define standalone fields.

On line 3 I am making a pseudo-timestamp field I can use to compare with data in the Log file. The date field, LGDATE, is seven digits long and is in the *CYMD format (CYYMMDD). Thus I take today's date, %date(), subtract one day, %days(1), to get yesterday's date, convert it to *CYMD without a date separator character, and append '200000', or 8:00 PM, to it.
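To make the *CYMD format concrete, here is the value wkDateTime would hold if the program ran on 24 April 2014 (the date is just an example; in *CYMD the century digit is 0 for 19xx dates and 1 for 20xx dates):

      //  %date() - %days(1)          = 2014-04-23
      //  %char(yesterday : *cymd0)   = '1140423'
      //  wkDateTime                  = '1140423200000'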

Lines 4 – 13 are where I create the SQL statement I will execute later. I am not going to describe the CREATE TABLE AS in detail, as I did so in the Creating a SQL table "on the fly" post. I will only describe the special things I put into the statement.

The LGMDTA field is a VARCHAR, variable length character, field that takes up 32002 characters in the Log file. I do not care about most of those characters, only the first 200. If I use the substring function, SUBSTRING(LGMDTA,1,200), I do get the first 200 characters of the data, but the result is still a VARCHAR field. By using the CAST function, on line 7, I can define a new column (field), MSGDATA, that is a 200 long CHAR field holding just the first 200 characters of LGMDTA.

Having created a comparison pseudo-timestamp, wkDateTime, I need to build the same value in the SQL statement to compare it against. I can use CONCAT to join two columns together, even when they are numeric, as LGDATE and LGTIME are. But CONCAT removes any leading zeroes, therefore the value "022435" becomes "22435 ", that is 22435 followed by a blank. Using the CHAR function instead, CONCAT(CHAR(LGDATE),CHAR(LGTIME)), also removed the leading zeroes. After some investigation I found that the DIGITS function returns the alphanumeric version of the number with the leading zeroes intact.
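The difference is easy to see in an interactive SQL session (STRSQL). A statement along these lines, run over the log file, shows both versions side by side (this query is illustrative, it is not part of the original program):

        SELECT CONCAT(CHAR(LGDATE),CHAR(LGTIME)),
               CONCAT(DIGITS(LGDATE),DIGITS(LGTIME))
          FROM QUSRBRM/QA1ALG

For a row logged at 02:24:35 the first expression loses the leading zero of the time, while the second returns the full "022435".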

Having the desired SQL statement in the field wkString, I then execute it using EXECUTE IMMEDIATE, on line 14, to create the work file.

Now I can create any indexes I need to sort the data contained within the work file. For example (the index name and key column here are only illustrative):

        exec sql CREATE INDEX QTEMP/@WORKINDX
                        ON QTEMP/@WORK (DATETIME)
                           RCDFMT INDEX1 ;

For more information on building indexes read the Add key to SQL table in QTEMP post.

When I look in the created work file I use the Message id, LGMID, column to determine if there was an error or not. The following BRM messages are the ones I found most useful:

Message id Description
BRM1380 Processing started
BRM1049 Completed without errors
BRM10A1 Completed with errors
BRM1657 Cancelled
BRM1970 Completed abnormally
BRM1046 Power down and IPL started before save completed
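With the work file built, the message ids above make it simple to pull out just the problem runs. A query along these lines, using the columns from the CREATE TABLE snippet earlier, would list every backup that did not complete cleanly (this is a sketch, not the exact report code):

        SELECT LGCGNAM,LGUSER,LGJOB,LGJNBR,DATETIME,MSGDATA
          FROM QTEMP/@WORK
         WHERE LGMID IN ('BRM10A1','BRM1657','BRM1970','BRM1046')
         ORDER BY DATETIME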

You can get the description for any message id using the following commands:

BRM messages, for example BRM1970:

        DSPMSGD RANGE(BRM1970) MSGF(QBRM/Q1AMSGF)

"CPF" messages, for example CPC3701:

        DSPMSGD RANGE(CPC3701) MSGF(QCPFMSG)

WARNING: IBM Support did warn me that BRMS file structures will be changing with the next release, 7.2. I was told to check the "What is new in this Release" section when 7.2 is released to see what changes would need to be made.


You can learn more about these on the IBM website:


This article was written for IBM i 7.1, and it should work with earlier releases too.


  1. Many thanks for the info. I have created a daily report for every LPAR. It informs us every morning about the result of the backup by sending an email to our general iSeries mailbox. I also want to use the generated work file to send an alert when a backup ended in error.

  2. Rudi van Helvoirt May 1, 2014 at 9:09 AM


    In BRMS it is possible to e-mail every single item out of the BRMS log, all you need is the BRMS Network Feature (option 1 of 5770BR1). This treasure is available in iNav and Navigator for i. In the green screen this function is not available.
    BRMS in the GUI is the way to go!!!

    Greetings Rudi

  3. Rudi van Helvoirt May 5, 2014 at 3:04 PM


    If you have the Advanced Job Scheduler (5770-JS1) also on your system, you are able to automate the sending of the recovery reports to an e-mail address after each backup is run. In case of a system recovery you then have all the information you need available, which is:
    1. The amount of disk space needed
    2. The tape volume ID's
    3. The recovery steps
    Again, all this can be done from iNav and Navigator for i, with no programming involved.

    Greetings Rudi

  4. Sir,

    Just to let everyone know, this will not work with V6R1 until you remove column LGCGNAM. It was added in V7R1.


  5. Rudi, could you show us a method to do this? I'm looking around and I can use Client Access to send messages to myself.

  6. I had to cast to CCSID 37 to see the value translated as a string.

  7. QA1ALG is a log of what happened in a BRMS job run.

    I have never checked to see if the same detailed information is in HISTORY_LOG_INFO. Even if it is, it would be a lot of unnecessary processing to get the same information from it.

    I am not sure if DISPLAY_JOURNAL would contain information about which objects were saved, and more importantly which were not. What about IFS objects too?

  8. No SQL "command" equivalent.
    DSPLOGBRM does have a table it extracts the information from: QUSRBRM/QA1ALG

  9. Hi,
    Your blog has helped in many ways,
    As DSPLOGBRM has a period parameter, DSPLOGBRM PERIOD(start_time start_date),
    why doesn't SQL provide a way to select from QUSRBRM/QA1ALG using a timestamp?
    Is there no way to get the logs like this: "SELECT * FROM TABLE(QUSRBRM/QA1ALG(starting_timestamp=>'2020-06-16 09:57:40.867520',ending_timestamp=>'2020-06-17 09:57:40.867520'))"?
    Is there any alternative you can provide to directly access the BRMS logs through SQL programming?

    1. Depending upon the release of IBM i the date could be a number in CYMD format.

    2. so SELECT * FROM TABLE(QUSRBRM/QA1ALG('CYMD')) must work right?

      RIGHT('000000' || TRIM(CHAR(LGTIME)),6)
      BETWEEN '1200425000000' AND '1200425010106'

