Thursday, April 27, 2023

COMMON Finland

Tervetuloa (welcome)! COMMON Finland has a new website, at commonfinland.fi

COMMON Finland becomes the 18th national association affiliated with COMMON Europe to launch its own website.

I have added their site to my IBM i User Groups page. There is a link to that page at the top of every page and post.

If you are a member of, or know of, any User Groups not listed on my IBM i User Groups page, please let me know using the Contact Form that you can find on every post and page.

Wednesday, April 26, 2023

New columns added to HISTORY_LOG_INFO

As part of the latest round of Technology Refreshes, IBM i 7.5 TR2 and 7.4 TR8, three new columns were added to one of my favorite Db2 for i Table Functions, HISTORY_LOG_INFO. I often use this Table Function to search an IBM i partition's history log.

The new columns are the three parts of the existing qualified job name column, FROM_JOB:

  • FROM_JOB_NAME:  Job name
  • FROM_JOB_USER:  Job user profile
  • FROM_JOB_NUMBER:  Job number

These columns might not sound like a big addition to HISTORY_LOG_INFO, but the first two, job name and user profile, make it much easier to filter for the results I am looking for, as the example below shows.
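For example, this is how I could use the new columns to find messages from jobs that ran under a particular user profile. This is a minimal sketch: the user profile SOMEUSER and the 24 hour window are just example values I have chosen for illustration:

    -- Messages logged in the past 24 hours by jobs run under
    -- the user profile SOMEUSER (a hypothetical example value)
    SELECT MESSAGE_ID, MESSAGE_TIMESTAMP,
           FROM_JOB_NAME, FROM_JOB_USER, FROM_JOB_NUMBER,
           MESSAGE_TEXT
      FROM TABLE(QSYS2.HISTORY_LOG_INFO(
                   START_TIME => CURRENT_TIMESTAMP - 24 HOURS))
     WHERE FROM_JOB_USER = 'SOMEUSER'
     ORDER BY MESSAGE_TIMESTAMP DESC ;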

Wednesday, April 19, 2023

Personalised SQL error logging, SELF

Update (January 24, 2024)
I can now use the special values '*ERROR', '*WARN', or '*ALL' rather than having to list all the SQL codes. Read about it here.


Introduced as part of IBM i 7.5 Technology Refresh 2 and IBM i 7.4 TR8 is a mechanism to capture details of SQL errors in a separate log table. I can decide which errors I want to capture in the log by giving their SQL codes. The SQL codes to capture are set at the SQL session level, rather than at the IBM i job level.

This is called the SQL Error Logging Facility, or SELF for short. It consists of several parts; the ones I am going to explain in detail are listed below, followed by a short example of how they fit together:

  • SQL_ERROR_LOG:  A View that is used to display the logged errors
  • SELFCODES:  A Global Variable that needs to contain the SQL codes I wish to log
  • VALIDATE_SELF:  A scalar function that validates SQL codes
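To give an idea of how these parts fit together, here is a minimal sketch of logging one SQL code in the current session. The SQL code -204, object not found, is just an example value; list whichever codes you want to capture:

    -- Tell this SQL session which SQL codes to log. VALIDATE_SELF
    -- checks that the list contains valid SQL codes
    SET SYSIBMADM.SELFCODES = SYSIBMADM.VALIDATE_SELF('-204') ;

    -- Any statement in this session that now fails with SQLCODE -204,
    -- such as selecting from a table that does not exist, is captured

    -- Display the logged errors
    SELECT * FROM QSYS2.SQL_ERROR_LOG ;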

Monday, April 17, 2023

ACS 1.1.9.2 now available

Update August 17, 2023: A newer version is now available for download here


As promised in the latest round of Technology Refreshes, a new release of Access Client Solutions, ACS 1.1.9.2, is now available.

There are two ways to install the update:

From ACS

Notice:  As of April 27 this method is still not working. Use the second method described below.

  • Open your ACS window
  • Select "Help" on the menu at the top of the window
  • Select "Check for updates" in the drop down menu

You will see the following window:

Wednesday, April 12, 2023

More about IBM i 7.5 TR2 and 7.4 TR8

This is what I have found since yesterday about the new Technology Refreshes, IBM i 7.5 TR2 and IBM i 7.4 TR8.

The new release of ACS is now available. You will find instructions about how to download it here.

Tuesday, April 11, 2023

New TRs, IBM i 7.5 TR2 and 7.4 TR8, announced

The Spring 2023 Technology Refreshes, IBM i 7.5 TR2 and IBM i 7.4 TR8, have been announced today.

The availability dates for the PTFs are:

  • Base TR PTFs:  May 5, 2023
  • Db2 (SQL) PTFs:  May 19, 2023
  • RPG PTFs:  Included in the Db2 fix pack

All the information about these TRs can be found:

Additions and changes to the RPG programming language are:

Wednesday, April 5, 2023

Easy way in SQL to insert records from one file that are not in the other

In this scenario there are two files with identical field names and data types. I was asked if there is an easy way, using SQL, to insert all the records from one file into a second file, omitting records that already exist in the second file.

I wanted to come up with a solution where I did not have to give any field/column names in the statement. For all I knew the files in question had many, many fields.

I created a file, which I called FILE1, with four fields. Then I used the Create Duplicate Object command, CRTDUPOBJ, to create a duplicate, which I called FILE2.

FILE1 contained four records, which I can show using the following SQL statement:
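    SELECT * FROM FILE1 ;

One way to perform the insert without naming any columns is to use EXCEPT, so that only the records of FILE1 that are not already present in FILE2 are inserted. This is a minimal sketch, assuming both files are in the library list:

    -- Insert only those records from FILE1 that are not in FILE2,
    -- without having to name any of the fields
    INSERT INTO FILE2
      (SELECT * FROM FILE1
        EXCEPT
       SELECT * FROM FILE2) ;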