Moving to Oracle HCM? How will you archive your Lawson data?

It’s official: you have finalized the project to move to Oracle HCM and are ready to begin your journey to the cloud. But wait! What will you do with all your Lawson legacy data? By now you probably know that you can’t take it all to Oracle HCM. In fact, in most cases you’re limited to a small subset of your original data. Your data retention policies may not allow you to turn off your Lawson 10x (or 9.0.1) servers just yet. Looks like you have more work to do.

Keeping those servers around for another 7 years is not an option. You’re likely running several Windows 2012 servers, or worse yet, Windows 2008, AIX, or AS400 iSeries. Not to mention the database servers, file servers, co-location, disaster recovery servers, and more. It’s enough to keep the most seasoned IT manager awake at night. What if the hardware fails? What if the old OS has a vulnerability we can’t patch? What if our server admin who knows how to maintain Lawson leaves? All these what-ifs are just the tip of the data retention iceberg.
One option is to park the data in a data lake and sweep it under the rug for the time being. But unless your users are data analysts with sophisticated BI tools and skills, the data lake might as well be Crater Lake.
What if there were a way to completely remove the Lawson footprint from your data center but still provide your users fast, secure, and intuitive access to all the data, with no servers or maintenance required? The APIX Serverless Framework is just that solution. Based on the AWS serverless stack, APIX has opened up incredible possibilities for inquiry-only applications never before possible without a substantial investment in infrastructure. Clients can now provision a web-based, lightweight data archive solution and migrate all their data within days rather than months, at a fraction of the cost of other solutions and with none of the risk. Find out how the APIX serverless framework can help you meet all your Lawson data archive needs and eliminate the legacy servers for good.

CBCHECK vs. CBTRANS

All AP payments hit the CBCHECK file. If your payment code (CB00.4) is set up to post in summary mode (CB00.7, under the Cash Payment button), the detail that matches your APPAYMENT file will only be found in the CBCHECK file. There is a serial number on both the CBCHECK and CBTRANS files; that is how you can link the two CB files for transactions that are booked in summary. The total transaction amount will appear in the CBTRANS table with the serial number (padded with leading zeros), which lets you get to the CBCHECK detail behind that total.
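The linkage described above can be sketched in a few lines of Python. This is an illustrative sketch only: the 10-character serial width, the sample rows, and the column names are assumptions for illustration, not the actual Lawson record layouts.

```python
# Sketch: linking CBTRANS summary rows back to CBCHECK detail by
# normalizing the serial number. The field width of 10 is an assumption.

def normalize_serial(serial, width=10):
    """Zero-fill a serial number so it matches the padded CBTRANS value."""
    return str(serial).zfill(width)

# Hypothetical rows pulled from each file
cbcheck_rows = [{"SERIAL": "12345", "AMOUNT": 250.00}]
cbtrans_rows = [{"SERIAL": "0000012345", "TRAN_AMOUNT": 250.00}]

# Index the CBCHECK detail by the padded serial, then walk the summary rows
detail_by_serial = {normalize_serial(r["SERIAL"]): r for r in cbcheck_rows}

for tran in cbtrans_rows:
    detail = detail_by_serial.get(tran["SERIAL"])
    if detail:
        print(tran["SERIAL"], "->", detail["AMOUNT"])
```

The same idea applies whether you do the padding in SQL, a report writer, or a spreadsheet: normalize the serial on one side so both files key on the same value.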

Archiving Lawson Data

When it comes to Legacy applications, data retention policies rule the day. Whether you’re moving to the newest cloud solution or simply upgrading, the option of keeping all of your data is likely not available to you. How do you choose what you keep around and what to purge? And what happens to the data you keep? How will you get access to it? How will you secure it? How long do you keep it around? Who will pay for it? Where will it be stored? Who will be responsible for maintaining it?

One option is to keep the old system around for inquiry purposes only. While this seems like the simplest solution, it can actually prove to be the most costly and complicated. Here are some ways keeping an old system like Lawson 9 or Lawson 10x around can backfire down the line:
  • Old server hardware becomes more unreliable each day, putting you at greater risk of losing your data.
  • Older operating systems like iSeries AS400, AIX, Windows 2008, and Windows 2012 Server are no longer supported by their respective vendors and pose an increasing security risk to your entire data center.
  • Additional components necessary to keep the application running will hold your organization back. Keeping older domain controllers, LDAP servers, web servers, and database servers around simply to prop up an already outdated application brings with it immeasurable costs and risks.
  • As employees leave the organization and new talent arrives, it becomes increasingly difficult to give and get access to the older applications and to train new users on them. This often requires retaining professional services that can cost thousands of dollars and may not be able to help in the worst-case scenarios.
Another option is to dump all the data into a data lake and worry about it later. While this option sounds quick and simple, it fails the most basic test for a data archive solution: accessibility. Users will require access to the archived data, and simply pointing them to a data lake does not provide realistic access. Oftentimes, clients end up spending hundreds of thousands of dollars on BI applications and custom analytics in order to extract the data they need.
Luckily, there is a third solution. The APIX serverless framework, based on the AWS serverless stack, has opened up incredible possibilities for inquiry-only applications never before possible without a substantial investment in infrastructure. Clients can now provision a web-based, lightweight data archive solution and migrate all their data within days rather than months, at a fraction of the cost of other solutions and with none of the risk. Find out how the APIX serverless framework can help you meet all your Lawson data archive needs and eliminate the legacy servers for good.

AD FS Authorized Token Error with InforOS

After a user attempts to log into InforOS using AD FS and gets the generic “An error occurred” message, your first stop should always be the Event Viewer on the AD FS server. AD FS errors are logged in the Windows application logs area of the Event Viewer.

If you see the message “The Federation Service could not authorize token issuance for caller <username>”, this means that there is no claims rule on the relying party trust that allows this user to authenticate.

Open the AD FS Manager and go to Relying Party Trusts.  Click “Edit Claim Rules” then “Add Rule”.  Here, you can add an advanced custom rule, but the most common solution is to add the rule “Permit All Users” so that all AD users will have access to this Relying Party Trust, and their final authorization will be completed by Infor Federation Services (IFS).

Adding Lawson Application in InforOS

To add your Lawson S3 Application to InforOS, log into the InforOS portal.  Go to the management menu (the little person at the top right), and select Admin Settings.

Click Add Application on the right side of the screen.

Select “Infor Application” for the Application Type.

Select your Lawson version.

Click the Choose Icon button to choose an Icon for your site.

Enter a descriptive Display Name.

The logical ID will auto populate, but you’ll need to append a unique string to the end of it (such as “test” or “prod”).

Enter the hostname for your Lawson application (server.domain.com).

Enter the port.

Leave the context as the default Lawson.

Leave tenant ID blank (unless your tenant id is not the default).

Now a link to your Lawson Application will appear in the Homepages grid in InforOS!
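The settings above can be collected into a quick pre-flight checklist before you click save. The values below are hypothetical examples, not defaults; only the Application Type and Context mirror what the steps above name.

```python
# Hypothetical example values for the Add Application form described above.
lawson_app = {
    "application_type": "Infor Application",
    "version": "Lawson S3",               # assumption: select your actual version
    "display_name": "Lawson Production",  # any descriptive name
    "logical_id_suffix": "prod",          # unique string appended to the auto-populated logical ID
    "hostname": "server.domain.com",      # placeholder hostname
    "port": 443,                          # assumption: your port may differ
    "context": "Lawson",                  # leave as the default
    "tenant_id": "",                      # blank unless yours is not the default
}

# Simple sanity check: the logical ID suffix must not be empty
assert lawson_app["logical_id_suffix"], "Append a unique string such as 'test' or 'prod'"
```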

 

GL199 Error Company Being Processed in Different Job

If you are running GL199 to close the period, you may encounter the error message “Company being processed in different job” in the print file (or in the job scheduler log).

This means that Lawson “thinks” another GL199 job is running and processing the same company your job is trying to process. If that is the case, allow the first job to finish before running yours. However, you may find that no other jobs are running, yet you still encounter the error.

To troubleshoot, first try to trace back to the first time the GL199 ran and failed (it may be in recovery or canceled).  Look at the examine log for that run.  There may be a different error that indicates the root cause of the issue.  If so, resolve that issue and recover or rerun that job.

If there are no issues to resolve, and you are certain that the GL199 didn’t fail in the middle of updating records, you can have your DBA update the status flag so that Lawson no longer thinks the GL199 is running. This can be accomplished by creating a paint screen for GLSYSTEM, or by making a direct update in the database. The field “UPDATING” needs to be set to 0. Additionally, check whether your failed GL199 job is stuck in GLMONITOR; if so, delete the record for that job. But make sure you are deleting the record for your job!
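The database fix above can be sketched as follows. This is a sketch only: it runs against an in-memory SQLite table standing in for GLSYSTEM, and the COMPANY column name and single-company layout are simplified assumptions. Your DBA would run the equivalent UPDATE against the actual Lawson database, and only after confirming the failed job did not die mid-update.

```python
import sqlite3

# Stand-in for the Lawson database: an in-memory SQLite table mimicking a
# simplified GLSYSTEM with the UPDATING status flag described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE GLSYSTEM (COMPANY INTEGER PRIMARY KEY, UPDATING INTEGER)")
conn.execute("INSERT INTO GLSYSTEM VALUES (100, 1)")  # flag left stuck on by a failed GL199

# Clear the flag so Lawson no longer thinks a GL199 is processing company 100.
conn.execute("UPDATE GLSYSTEM SET UPDATING = 0 WHERE COMPANY = 100")
conn.commit()

flag = conn.execute("SELECT UPDATING FROM GLSYSTEM WHERE COMPANY = 100").fetchone()[0]
print(flag)  # 0
```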

DME Queries in Lawson

Lawson provides the ability to query Lawson tables via URL. These transactions are called “DME” queries and can be quite useful in IPA processes, or in scripting bulk data calls.

To query data using a DME call, the URL is https://<server name>/servlet/Router/Data/Erp?. To select specific fields, use the “FIELD” keyword. To filter the results, use the “SELECT” keyword. You will need knowledge of the data tables and columns to build these queries; the dbdef command in LID can be quite useful for this.

GEN

To query GEN data, the syntax would be https://<servername>/servlet/Router/Data/Erp?PROD=GEN

For example, this URL would return the PRODUCTLINE and FILENAME fields for the FILEDEF record with prefix “API”:

https://lawson.company.com/servlet/Router/Data/ERP?PROD=GEN&FILE=FILEDEF&SELECT=PREFIX=API&FIELD=PRODUCTLINE,FILENAME

LOGAN

To query LOGAN data, the syntax would be https://<servername>/servlet/Router/Data/Erp?PROD=LOGAN

For example, this URL would return the VERSION data in the LOGAN data area

https://lawson.company.com/servlet/Router/Data/ERP?PROD=LOGAN&FILE=VERSION

Data Area

To query your desired data area, the syntax would be https://<servername>/servlet/Router/Data/Erp?PROD=<data area>

For example, this URL would return the currency code “USD” from the CUCODES table in the TEST data area.

https://lawson.company.com/servlet/Router/Data/ERP?PROD=TEST&FILE=CUCODES&SELECT=CURRENCY-CODE=USD&FIELD=CURRENCY-CODE,DESCRIPTION

Output

The default output type is XML, but you can also output in CSV format. To do that, append the parameter “&out=csv” to the URL.
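Putting the pieces together, here is a minimal Python sketch for building these DME URLs. The server name is a placeholder, and actually hitting the endpoint would require your normal Lawson authentication, so the sketch only constructs the URL string from the keywords described above.

```python
from urllib.parse import urlencode

def dme_url(server, prod, file=None, select=None, field=None, out=None):
    """Build a DME query URL from the PROD/FILE/SELECT/FIELD keywords."""
    params = {"PROD": prod}
    if file:
        params["FILE"] = file
    if select:
        params["SELECT"] = select
    if field:
        params["FIELD"] = ",".join(field)
    if out:
        params["out"] = out  # e.g. "csv" for CSV output instead of the default XML
    # safe="=," keeps the '=' inside SELECT and the ',' inside FIELD unescaped,
    # matching the raw URLs shown in the examples above
    return f"https://{server}/servlet/Router/Data/ERP?" + urlencode(params, safe="=,")

# Rebuild the GEN example from above (hypothetical server name):
url = dme_url("lawson.company.com", "GEN",
              file="FILEDEF",
              select="PREFIX=API",
              field=["PRODUCTLINE", "FILENAME"])
print(url)
```

A helper like this keeps bulk scripting consistent: loop over a list of SELECT values, generate one URL per record, and fetch each with your usual authenticated HTTP client.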