- To run a batch command in Lawson, first open the Lawson Interface Desktop (LID).
- Type tokendef.
- For this example, select Environment Form IDs and then select the SECURITY category.
- Place your cursor at the top of the Environment Form ID list and press F8 to insert a command.
- Type the command you want to add; in this example it is customcomm.
- In Lawson Security Administrator (LSA), go to the ENV profile and open the security class you want to assign the customcomm command to. Under Environment Security, you should see your new batch command; select it and grant all access.
- And we’re done!
While this error itself may be obvious, you’ve already taken the steps to run the job under a different user with the same parameters, and everything seems to be working. Sound familiar? Let’s dive deeper to see whether this issue matches yours.
- Go to the Lawson batch job form for the user that is having issues.
- Inquire on the job and check whether the user displays as “Unknown”.
- If you see “Unknown” at the bottom of the browser bar, continue; otherwise, you likely do not have the same issue.
- Open Lawson Interface Desktop (LID) and run the command: listusermap -n
- In LID, run the command: listusermap -a
- This should generate a fresh list of user identities.
- Clear your server cache and IOS cache, then log out and back in, and your issue should be resolved.
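The fix above boils down to two LID commands; a minimal transcript of the session might look like this (flag behavior as described in the steps above):

```
listusermap -n    (display the current user identity mappings)
listusermap -a    (regenerate the full list of user identities)
```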
Often you will have IPA processes that run Lawson batch jobs or perform some task with the output from a batch job. There may come a time when you need to programmatically convert the output of those jobs to another format, such as PDF or CSV. The bldxffiles command is a Lawson system command that allows you to do just that. You can use a System Command node in IPA to run the command against batch job output and convert the file. (This even works in the cloud, but one caveat to note is that multi-tenant environments do not allow System Command nodes in IPA.)
Command syntax is:
bldxffiles -[ALCPRTX | U] [abcdpstv] <username> <jobname>
|Option|Description|
|---|---|
|-A|Generate the ADO files (_ado.xml, .csv, and _ado_schema.xml). These are used by the OLE DB server.|
|-L|Generate partial CSV files starting from a point in the report. Use in conjunction with the -o option, which indicates the line number to start at. Can also be used in conjunction with the -S option to create a partial CSV file for a single print file. Example: bldxffiles -L sjohnson CU201JOB1 -o28 generates the CSV file CU201_partial.csv.|
|-C|Create a CSV file.|
|-P|Create a PDF file.|
|-R|Create a print file with left-to-right orientation.|
|-T|Include total groups in the CSV.|
|-S|Build files only for the path and print file specified.|
|-k|Use the specified separator as the value separator in CSV files: a = Tab, b = Space, c = Comma, s = Semicolon.|
|-o|Specify the line number to start at for a partial CSV file (used in conjunction with the -L option).|
|UserName|If you are not using the -F option, this is the name of the user who created the print files to be converted. If you are using the -F option, this is a placeholder only; you must specify some text, but it can be any character.|
|JobName|The name of the job whose print files are to be converted.|
|JobStep|The job step whose print files are to be converted.|
|FullFilePath|The path to the input file, if you are using the -S or -F option.|
|FileName|The name of the input file, if you are using the -S or -F option.|
Using it in IPA
In the properties of the System Command Node, put your bldxffiles command. Typically you will want to use the “S” option so you can run the command against a filename, rather than being tied down to a specific user and job name (which could change readily). The syntax for converting a prt file to CSV would be similar to:
bldxffiles -SC <NTID> “<filepath>” <filename>
The NTID is for Windows only. On a Unix system, it would be a username.
NOTE: If you are not running the bldxffiles command in the user’s print directory (i.e., you are working on a print file that has been moved to another location), you will need to make sure that you have the print file’s corresponding detail file in the same directory where you are running the command. For instance, if you have a print file named AP520.prt, there is a corresponding AP520.dtl file in that same print directory. That .dtl file needs to follow the .prt file, or your command will return an error.
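As a quick illustration of that rule, moving a print file means moving both files. This sketch uses placeholder paths and the AP520 file names from above; real print files live under the Lawson print directory:

```shell
# Placeholder demonstration: the .dtl file must travel with the .prt file.
mkdir -p /tmp/printdir /tmp/workdir
touch /tmp/printdir/AP520.prt /tmp/printdir/AP520.dtl

# Copy BOTH files, not just the .prt, or bldxffiles will return an error.
cp /tmp/printdir/AP520.prt /tmp/printdir/AP520.dtl /tmp/workdir/
ls /tmp/workdir
```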
Here is a command that will convert a check file to CSV format using Lawson’s NTID. In this case, the file has been moved from the print directory to a new directory, so the .dtl file was also copied over. The .dtl file is also named checkf.dtl.
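Following the -S syntax above, such a command might look like the following. The NT ID and directory path here are placeholders for this example; substitute the values for your system:

```
bldxffiles -SC lawson "D:\checkwork" checkf
```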
Microsoft is hardening security with LDAP channel binding and LDAP signing in an upcoming update. Any applications that rely on LDAP connections to Active Directory Domain Services (AD DS) or Active Directory Lightweight Directory Services (AD LDS) need to be converted to LDAPS. LDAPS is a secure connection protocol used between applications like Lawson and the network directory or domain controller. Below are the potentially impacted Lawson applications mentioned by Infor in a recent KB article.
Impacted Lawson applications:
- Lawson System Foundation (LSF) environments using AD LDS instances for Authentication Data Store (RM Configuration).
- Lawson System Foundation (LSF) environments using an LDAP Bind to Windows Active Directory for authentication.
- Landmark Environments using an LDAP Bind to Windows Active Directory for authentication.
- Infor Federated Services (IFS) synchronization connections to Active Directory.
Infor has recommended that on-premises clients configure the impacted applications for LDAPS and has provided KB articles on how to perform these tasks.
Some important things to note:
- This change does affect you, even if you have implemented AD FS
- If you are using Microsoft Add-ins for LSF and Lawson Process Administrator for Landmark, you will have a Thick Client installed that uses LDAP Bind.
- If your networking team takes the Microsoft LDAPS update and enforces LDAPS connections before these changes have been configured, your Lawson applications will fail in the following ways:
- The LASE process on LSF will fail to start.
- Users of services that rely on LDAP Bind (Landmark Rich Client, MSCM handheld devices, IPA flows to LSF) will be unable to log in.
- IFS will be unable to sync users from Active Directory.
- This change will NOT impact DSP applications
- DSP applications include Infor Business Intelligence (IBI or LBI), Lawson Smart Office (LSO), Mobile Supply Chain Management (MSCM), etc.
- These applications use Infor Lawson for authentication. They are not bound to LDAP, nor do they have their own instance of AD LDS.
- You can update your configuration at any time
- The changes recommended by Infor can be completed before LDAPS connections are enforced, and there will be no negative impact to your system.
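Before signing is enforced, one way to check whether a domain controller is already accepting LDAPS connections is a standard OpenSSL client test. The host name below is a placeholder; port 636 is the default LDAPS port:

```
openssl s_client -connect dc.example.com:636 -showcerts
```

If the domain controller returns its certificate chain, it is listening for LDAPS on that port.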
Many ERP systems allow a much longer location ID than Lawson does.
Many Punch Out vendors also have limitations on the size of the location ID. Make sure to validate with any Punch Out vendors you may have before you come up with a new location naming convention; otherwise, you may have to revisit this item later.
If you are performing a blddbdict as part of database changes, or as part of a product line copy, and you receive the error “Dictionary Not Built – Fix Errors And Rebuild”, perform the following commands in LID:
After that, your blddbdict command should be successful.
The Lawson query node can be used to query Lawson data using the DME format. In the properties of the node, click “Build” to build your query. Select the product line from which you are trying to get the data. Select the module and the table name where the data resides. Choose the fields that you want to see. You will also have an opportunity to choose related fields from other tables (e.g., a description from a parent table). You can also use an index and provide keys for your query, using either hard-coded values or an IPA variable. On the Criteria tab, you can select other fields besides key index fields to narrow your results. You can use the Test tab if you don’t have any variables in your query.
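Under the hood, the query node issues a DME request against the Lawson IOS servlet. A hand-built DME URL for a similar query might look like the following; the server name, product line, table, field, and index names here are illustrative assumptions, not values from this article:

```
https://yourserver/servlet/Router/Data/Erp?PROD=PRODLINE&FILE=APVENMAST&FIELD=VENDOR;VENDOR-VNAME&INDEX=APVSET1&KEY=10
```

Pasting a URL like this into a browser while logged in to Lawson is a handy way to sanity-check a query before wiring it into a flow.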
Use the Landmark Transaction node to query or update Landmark data. In the properties window, select “Build” and you will be presented with a wizard to help you build your Landmark query.
Select the data area that you are querying/updating. Select the Module and Object Name. (HINT: these values can be found by using Ctrl+Shift+Click on the form in Rich Client or the Landmark Web UI).
Choose your action. There are basic CRUD (create, read, update, delete) actions for each object, and there will be more actions specific to the object you selected. Action Operator will likely be “NONE”. Select your action type (SingleRecordQuery, MultipleRecordQuery, etc.), and finally select the criteria. Click OK.
Decide whether to use hard-coded values for your transaction field values; you can supply variables here instead to make your flow more portable.