
Wednesday, December 14, 2016

Searching the History log using SQL

HISTORY_LOG_INFO: a better way to get entries from the history log

I am sure I am not the only person who has received a message, email, or telephone call informing me that there was an error message that happened earlier. When I reply asking for more details it is not uncommon to be told: "I don't know what the error was, I just replied with a C". This, of course, tells me nothing useful for diagnosing what the error was or where it happened. Often it drives me to the History log, which I have to search through looking for an error message. Narrowing the possible time frame makes the searching easier, but if the best the message sender can tell me is "earlier this afternoon" or "during the day end run" I am left with a large amount of log data to search through.

IBM has come to my rescue with a DB2 for i (SQL) table function that allows me to search the History log and select just the data I want to see. I am not sure if this is part of the IBM i 7.3 TR1 and 7.2 TR5 updates. You can check whether it is present on your IBM i by using the PTF information view, PTF_INFO, to look for the relevant PTF group levels:

  • IBM i 7.3 SF99703 Level 3
  • IBM i 7.2 SF99702 Level 14
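
If you prefer to check with SQL, a query over the Group PTF information view is one way to do it. This is only a sketch, assuming the QSYS2.GROUP_PTF_INFO view and these column names are available on your release:

-- Installed level of the database PTF groups (assumed view and columns)
SELECT PTF_GROUP_NAME, PTF_GROUP_LEVEL, PTF_GROUP_STATUS
  FROM QSYS2.GROUP_PTF_INFO
 WHERE PTF_GROUP_NAME IN ('SF99703', 'SF99702')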

The HISTORY_LOG_INFO table function, when used in its simplest form, gives all of the same information that looking at the History log does.

SELECT * FROM TABLE(QSYS2.HISTORY_LOG_INFO()) A

Which, in my opinion, is just as overwhelming as the History log output.
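
If all I want is a quick peek at the most recent entries, rather than wading through everything, I can sort by the ORDINAL_POSITION column (described below) in descending order and limit the number of rows returned. This is only a sketch; 20 is an arbitrary number:

-- Newest 20 History log entries; highest ORDINAL_POSITION = most recent
SELECT *
  FROM TABLE(QSYS2.HISTORY_LOG_INFO()) A
 ORDER BY ORDINAL_POSITION DESC
 FETCH FIRST 20 ROWS ONLY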

I can narrow the date and time range, as I can with the History log, by using the Table function's two parameters. Both parameters are timestamps.

01 SELECT * 
     FROM TABLE(QSYS2.HISTORY_LOG_INFO(CURRENT TIMESTAMP - 1 DAY)) A

02  SELECT * 
      FROM TABLE(QSYS2.HISTORY_LOG_INFO('2016-11-30-06.00.00.000000',
                                   '2016-11-30-12.00.00.000000')) A

Line 1: As only one parameter is given, it is assumed to be the from timestamp. In this case I want to start one day before the current time.

Line 2: I have given two parameters; they must be separated by a comma ( , ).

While this reduces the amount of data I have retrieved, I am still left with a lot of log entries I don't care about, for example the successful completion of jobs.
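
For example, I could drop the normal job completion messages before I even start looking. This is only a sketch; I am assuming CPF1164 ("Job ended") is the completion message I want to hide, and note that entries with no message ID, such as replies, are also dropped by this test:

-- Last day of History log entries, excluding job completion messages
-- (assuming CPF1164 is the "Job ended" message)
SELECT MESSAGE_ID, MESSAGE_TIMESTAMP, FROM_USER, FROM_JOB
  FROM TABLE(QSYS2.HISTORY_LOG_INFO(CURRENT TIMESTAMP - 1 DAY)) A
 WHERE MESSAGE_ID <> 'CPF1164'
 ORDER BY ORDINAL_POSITION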

As this is a Table function I can use an SQL Select statement to refine the search criteria I use to find the data I want. Before I give a fuller example, let me list what I consider to be the useful columns I will be using.

Column name - Description
ORDINAL_POSITION - Unique number of the entry. The older the entry, the lower the number.
MESSAGE_ID - Message ID.
MESSAGE_TYPE - Type of the message. I am not going to list them all here; if you want to know all the possible values see the link provided at the bottom of this post.
SEVERITY - Severity of the message.
MESSAGE_TIMESTAMP - The timestamp when the message was sent to the History log.
FROM_USER - User of the job.
FROM_JOB - Full job name.
MESSAGE_TEXT - First level message text. This is in a VARGRAPHIC column in CCSID 1200, so I will need to CAST it to make it readable on my IBM i.
MESSAGE_SECOND_LEVEL_TEXT - Second level message text. This is in a VARGRAPHIC column in CCSID 1200, so I will need to CAST it to make it readable on my IBM i.

There are other columns, but as I am not going to use them I will refer you to the link at the bottom of this post.

With these columns I can create a Select statement to list all errors in my desired date range.

01  SELECT MESSAGE_ID,
02         MESSAGE_TYPE AS MESSAGE_TYPE,
03         SEVERITY,
04         MESSAGE_TIMESTAMP AS TIMESTAMP,
05         FROM_USER,
06         FROM_JOB,
07         CAST(MESSAGE_TEXT AS CHAR(200) CCSID 37)
               AS MESSAGE_TEXT,
08         CAST(MESSAGE_SECOND_LEVEL_TEXT AS CHAR(1000) CCSID 37)
               AS MESSAGE_TEXT_2
09    FROM TABLE(QSYS2.HISTORY_LOG_INFO('2016-12-03-16.00.00.000000',
                                   '2016-12-03-16.15.00.000000')) A
10   WHERE SEVERITY >= 30
11   ORDER BY ORDINAL_POSITION

I am using AS on lines 2, 4, 7, and 8 to give the columns more meaningful labels. For example, "MESSAGE_TYPE" is easier to understand than "MESSA00001".

Lines 7 and 8: I use the CAST to convert these columns into something I can read.

Line 9: I am giving the date and time range I desire as parameters in the Table function.

Line 10: I only want to know about any log entry with a severity of 30 or greater.

Line 11: I want to sort by oldest first.

This gives me the following (I have broken the output into separate sections just to display it here):


MESSAGE_ID  MESSAGE_TYPE SEVERITY  TIMESTAMP                   FROM_USER  FROM_JOB
 RNQ1216    INQUIRY           99   2016-12-03-16.09.54.579968  SIMON      026063/SIMON/TESTRPG   
 RNQ1216    INQUIRY           99   2016-12-03-16.10.02.045824  SIMON      026063/SIMON/TESTRPG   
 -          REPLY             99   2016-12-03-16.10.02.099072  SIMON      026052/SIMON/QPADEV0013



MESSAGE_TEXT
Error message CPF4131 appeared during OPEN for file TESTDSPF (C S D F).                
Error message CPF4131 appeared during OPEN for file TESTDSPF (C S D F).                
C                                                                                      


MESSAGE_TEXT_2
&N Cause . . . . . : RPG procedure TESTRPG in program MYLIB/TESTRPG received the message CPF4131 while performing an implicit OPEN operation on file TESTDSPF. The actual file is TESTDSPF.
&N Cause . . . . . : RPG procedure TESTRPG in program MYLIB/TESTRPG received the message CPF4131 while performing an implicit OPEN operation on file TESTDSPF. The actual file is TESTDSPF.
-

The equivalent using the DSPLOG command is:

DSPLOG PERIOD((161000 120316) (161500 120316)) OUTPUT(*PRINT)

I would then have to search the generated spool file to discover:


MSGID    SEV MSG TYPE
RNQ1216  99  INQUIRY      Error message CPF4131 appeared during OPEN for file TESTDSPF (C S D F).
                      TESTRPG    SIMON     026063 QMHSCLVL     0000 16-12-03 16:10:02.913396 SIMON
         99  VALID RPY    C
                      QPADEV0013 SIMON     026052 QMHSCLVL     0000 16-12-03 16:10:02.913409 SIMON

In my opinion using HISTORY_LOG_INFO gives me what I want a lot more quickly and easily than searching through the output from the DSPLOG command. I for one will be using this Table function the next time I get one of those "There was an error and I don't know when" communications.

 

You can learn more about the HISTORY_LOG_INFO Table function from the IBM website here.

 

This article was written for IBM i 7.3 TR1 and should work with 7.2 TR5 too.

19 comments:

  1. Looking forward to it, since I am Rocking 7.1 right now and I just did a search and we do not have that function...

  2. Thank you so much for the article. In terms of logs is there any way to retrieve in 7.1 a Log for a job that is gone?
    Thanks.

  3. Long story short, No. If QPDSPJOB spooled file for the job is no longer in the system or in FIN status, the job is gone.

  4. Another reason to move on from v7r1. I like the ease of looking at the log.

  5. I have a question, since my present shop does not have 7.3. What would be the impact of temporal tables on journal files? I see the importance of journaling being diminished. Much of the data might just be traced instead of loading journal files.

    Replies
    1. You cannot roll back with temporal tables.

      But the ability to have multiple jobs look at different dates and times of the data is mind blowing!

  6. Simon
    I just used this process to locate IP addresses used by some ODBC users. It works great. I integrated it into a CGI program using CGIDEV2. Now I have a web display. I key the user id and press enter. A display of the IP addresses from the CPIAD09 messages appears. This will save me tons of time.
    Thanks
    Steve Bowdoin

  7. That is not a question I can answer. It all depends on how far back your system admin keeps the history logs.

  8. The date in timestamps is always returned in *ISO format.

    I noticed the difference in the formatting of the timestamp, as you described, when I started using ACS. It would appear to be something in ACS. If I look at the same column and row in an RPG program I see it formatted in the way I expected.

  9. It depends on the settings of your ACS and the IBM i you are using.

    I am in the USA. When I run the following statement:
    SELECT CURRENT_TIMESTAMP FROM SYSIBM.SYSDUMMY1

    I get a slightly different format depending upon which tool I am using.

    ACS gives me : YYYY-MM-DD HH:MM:SS.MMMMM

    STRSQL gives me : YYYY-MM-DD-HH.MM.SS.MMMMM

    Should you be worried about this? In my opinion, no. They are both representations of the value of the timestamp, not the actual value.

  10. I didn't know about this one, but it is going to come in real handy. Thanks for sharing.

  11. thanks very useful info

  12. Simon, thanks for sharing, it’s a great read. I like using views, they are great. Again, thanks.

  13. Hi Simon, is there any statement that permits me to see the commands to be executed when a job is still in the jobq?

    Replies
    1. Rather a strange question to ask.

      No, the operating system has no idea of what commands will be executed by a job in the future.

      All you can do is look at the program(s) that the job will be executing to see what commands are in them.

  14. These SQL gems are awesome.

  15. Hi Simon, great info as always! But when I run the SQL (the first example here and one I found from IBM) it is only returning results from two days ago and sooner. When I run: DSPOBJD OBJ(QSYS/QHST*) OBJTYPE(*FILE) it shows I have log files back two weeks, but I can't get those results in SQL. Is there some limitation to the SQL results or am I doing something wrong?

    Replies
    1. That is interesting. The only time I have encountered something similar is when someone had deleted one of the QHST files, so that there was a "gap". HISTORY_LOG_INFO could not return the data from before the "gap".

