Problem:

How do I delete or clear out the IBM WebSphere Application Server (WAS) temporary directories and cached files?

Summary:

This guide explains the process of removing or erasing the temporary directories and cached files in IBM WebSphere Application Server (WAS). It includes the appropriate situations for performing this task, the intended users, and the step-by-step instructions.

When is it necessary to perform this action?

If you encounter issues where the deployment manager, nodeagent, or application server fails to start, you can attempt the following steps. However, unless specifically requested by a support analyst, there is no need to carry out these actions.

 

Who should carry out this task?

System Administrators are responsible for executing these steps.

 

How is this done?

Follow the instructions below for each profile located within WAS_HOME/profiles (including Dmgr01 and AppSrv01, or whichever name your application server profile has).

 

  1. Stop the Deployment Manager, nodeagent, and application servers.
  2. Create a backup of the existing configuration for each profile:
    1. cd PROFILE_ROOT/bin
    2. Run backupConfig:
      1. Unix: ./backupConfig.sh backup_file
      2. Windows: backupConfig backup_file
      3. IBM i: ./backupConfig backup_file
    3. Repeat for every profile you have (Dmgr01, AppSrv01, etc.).
  3. Rename the following temp directories (or clear out their contents). They will be recreated when you restart the servers:
    1. PROFILE_ROOT/wstemp
    2. PROFILE_ROOT/temp
    3. PROFILE_ROOT/config/temp (*** DO NOT REMOVE THE ENTIRE config DIRECTORY, ONLY config/temp ***)
    4. Repeat for every profile you have (Dmgr01, AppSrv01, etc.).
  4. Delete the javasharedresources directory:
    1. Unix and IBM i: /tmp/javasharedresources
    2. Windows: C:\Windows\System32\config\systemprofile\AppData\Local\javasharedresources
  5. From a command prompt or Qshell prompt, run the following commands to initialize the OSGi configuration and clear the OSGi class cache:
    1. cd PROFILE_ROOT/bin
    2. Unix: ./osgiCfgInit.sh and ./clearClassCache.sh
    3. IBM i: ./osgiCfgInit and ./clearClassCache
    4. Windows: osgiCfgInit and clearClassCache
    5. Repeat for every profile you have (Dmgr01, AppSrv01, etc.).
  6. Start the Deployment Manager, nodeagent, and application servers.
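On Unix-like systems, the rename portion of the steps above can be sketched in shell for a single profile. This is a minimal sketch, not IBM tooling: the clear_profile_temps function name and the timestamped ".old" suffix are my own, and backupConfig, osgiCfgInit, and clearClassCache must still be run separately as described.

```shell
# Minimal sketch: rename (not delete) the WAS temp directories for one profile
# so they are rebuilt on the next start. Run only while the servers are stopped.
clear_profile_temps() {
  profile_root=$1                       # e.g. /opt/IBM/WebSphere/AppServer/profiles/AppSrv01
  stamp=$(date +%Y%m%d%H%M%S)
  for d in wstemp temp config/temp; do  # note: config/temp only, never the whole config dir
    if [ -d "$profile_root/$d" ]; then
      mv "$profile_root/$d" "$profile_root/$d.old.$stamp"
    fi
  done
}
```

Run it once per stopped profile, then delete the .old copies after a clean restart confirms nothing is missing.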

 

Good luck!

Description:

To resolve the error message “Restart the Server Express License Manager or License Manager is corrupt,” reported as compile error 191, 192, or 197, follow the troubleshooting steps outlined below.

 

Enter the command ps -ef|grep mfl to see if your License Manager is running. If the License Manager isn’t running, start it. If the License Manager is running, kill and re-start it by moving to the mflmf directory and entering the command sh ./mflmman.
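The running check can be wrapped in a small helper. A sketch, with lm_running as a made-up name; it looks for the mflm_manager process mentioned later in this article:

```shell
# Sketch: report whether the Micro Focus License Manager is running.
# The [m] in the pattern keeps grep from matching its own command line.
lm_running() {
  ps -ef | grep -q '[m]flm_manager'
}

if lm_running; then
  echo "License Manager is running"
else
  echo "License Manager is not running; start it with: cd mflmf && sh ./mflmman"
fi
```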

 

If the license database is corrupt, go to the License Manager directory. (Note: The License Manager directory is the location where the license was installed.) Remove the following four files from the mflmf directory: mflmfdb, mflmfdb.idx, mflmfdbX, and mflmfdbX.idx. After these files have been removed, run License Administration Services (mflmadm) and re-install the licenses.
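The removal of the four database files can be sketched as a helper; reset_license_db is a made-up name, and the argument is the mflmf (License Manager) directory:

```shell
# Sketch: remove the four license database files so that running
# License Administration Services (mflmadm) can rebuild them.
reset_license_db() {
  lmdir=$1   # the mflmf directory where the license was installed
  for f in mflmfdb mflmfdb.idx mflmfdbX mflmfdbX.idx; do
    rm -f "$lmdir/$f"
  done
}
```

After running it against the real mflmf directory, run ./mflmadm and re-install the licenses as described above.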

 

Follow these steps if you want to add or re-add developer licenses:

Use the cd command to move to the directory where License Manager was installed.

Execute the mflmadm program by entering the command ./mflmadm.

Press F3 (Install) to install the ServerExpress and/or the MicroFocus Cobol license.

When prompted, enter your key and serial number. (Note: You must hit the slash ( / ) key twice.) Press Enter to save your key and serial number.

Press F3 (Install) to install and F7 (Refresh) to refresh. Press F5 (Browse) to see your ServerExpress license. Press F6 (More) to see both your ServerExpress and MicroFocus Cobol licenses.

Start the License Manager by going to the mflmf directory and entering the command sh ./mflmman. To verify that the License Manager is running, enter the command ps -ef|grep mfl. (If the License Manager is running, a root mflm_manager process should be returned.)

 

If the License Manager is still corrupt, remove the entire mflmf directory and use the cd command to move into the $COBDIR/lmf directory. Run lmfinstall. Select just the ServerExpress install option. You can either enter your developer serial number and license during this ServerExpress install OR you can enter them after the install has completed.


Follow these steps if you want to enter your developer serial number and license after your ServerExpress install is complete:

Use the cd command to move to the mflmf directory.

Run ./mflmadm.

Press F3 (Install) to install, and add your serial and license number.

Press F3 (Install) again.

Press F7 (Refresh) to refresh.

Verify that the License Manager is running by entering the command ps -ef|grep mfl. If the License Manager is running, a root mflm_manager process should be returned. If the License Manager isn’t running, move to the mflmf directory and run the command sh ./mflmman to start your License Manager.

If the LAWDIR/productline/work directory contains many files that are taking up space and need to be cleaned up, most of them can be deleted. Be aware, however, that files with UPPERCASE names are often used to transfer data to non-Lawson systems or to ACH bank files, and they may be waiting to be used by a future process that has not run yet.

 


Automated Cleanup of Print Files, Work Files, and Other Files

Use this procedure to clean up print files, work files, RQC files, user home directory files, and WebSphere trace and dump files by either running a program or by defining a recurring job that calls the program. Before running the program or the recurring job, you must set up configuration files. These files enable you to set the cleanup options, exclude specific files from the cleanup process (by file name or by the user name associated with the files), and to specify date ranges for the files to be deleted.

The types of files to be deleted include:

  • Completed job entries
  • Completed jobs forms and the associated job logs
  • Batch report files
  • All print files from the print manager
  • Files from $LAWDIR/productline/work directory
  • All user-related files from the $LAWDIR/RQC_work, $LAWDIR/RQC_work/log, $LAWDIR/RQC_work/cookies directories
  • WebSphere trace and dump (.dmp, .phd, javacore and .trc) files that are in subdirectories of the <WASHOME>/profiles directory.

To clean up print files, work files, RQC files, and other files:

  1. Configure the prtwrkcleanup.cfg file.

You can edit the parameters in the prtwrkcleanup.cfg file in two ways:

    • By using the Lawson for Infor Ming.le Administration Tools. See Configuring Automated File Cleanup in the Lawson for Infor Ming.le Administration Guide, 10.1. (This option is only available in Lawson for Infor Ming.le 10.1.)
    • By directly editing the file in the $LAWDIR/system directory. See Directly Updating the prtwrkcleanup.cfg File.
  2. Configure the prtwrkcln_usrnames.cfg file.

This configuration file is divided into multiple sections:

    • Section 1: usernames list for print and completed job cleanup
    • Section 2: usernames list for RQC cleanup
    • Section 3: usernames list for user home directory cleanup

The script uses each different section for a different cleanup job. Make sure to put usernames in the right sections to avoid undesired outcomes.

You can enter multiple usernames in either a comma-separated format or a line-break-separated format.

For example:

Username1,Username2,Username3…

Or

Username1

Username2

Username3

Note: Do not remove the section dividers.

  3. Configure the prtwrkcln_exclude.cfg file.

Use this file to specify a list of file names to be excluded from the work file cleanup process.

You can enter multiple file names in either a comma-separated format or a line-break-separated format.

For example:

Filename1,Filename2,Filename3…

Or

Filename1

Filename2

Filename3

  4. If you want to run the cleanup program just once, open a command line session and follow the substeps below. (Note that a recurring job may be more useful in the long term. See the next main step below.)
    • Ensure that the prtwrkcln executable exists in $GENDIR/bin.
      • In a command line session, navigate to $GENDIR/bin.
      • At the command line, enter the following command:

prtwrkcln

  5. If you want to run the cleanup program via a recurring job, use the following substeps.
    • In Lawson for Infor Ming.le, navigate to Bookmarks > Jobs and Reports > Multi-step Job Definition.
      • Specify this information to define a multi-step job.

Job Name

Specify a name for the multi-step job.

Job Description

Specify a description for the multi-step job.

User Name

Displays the name of the user defining the job.

Form

Specify prtwrkcln. (This assumes this form ID exists. Use the Form ID Definition utility (tokendef) to check whether it exists and to add it as an environment form ID if necessary.)

Step Description

Specify a description for the step.

      • Click Add to save the new job.
        • Navigate to Related Forms > Recurring Job Definition. Define a recurring job according to the instructions in the “Recurring Jobs” topic in Infor Lawson Administration: Jobs and Reports.

Customizing the Lawson Ribbon can be a good idea for giving users a visual cue about which environment they are working in (see screenshot below).

 

Customizing the ribbon is as simple as changing one line of HTML code. To update the ribbon image, open the index.htm file at WEBDIR/lawson/portal/. Next, navigate to the “topBanner” element and add a background image (you can use your editor’s “find” function to locate it faster), setting the URL to the path where you saved your image (see below).
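The change looks roughly like the following. This is a sketch only: the exact markup of the topBanner element varies by Portal version, and the path custom/dev-ribbon.png is a placeholder for wherever you saved your image.

```html
<!-- Before (roughly): the element to locate in WEBDIR/lawson/portal/index.htm -->
<div id="topBanner"></div>

<!-- After: inline background-image added; the image path is a placeholder -->
<div id="topBanner" style="background-image: url('custom/dev-ribbon.png');"></div>
```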

 

Save your changes and your ribbon will now be a custom view!

Introduction:

Migrating data from on-premises databases to the cloud is a critical step for organizations seeking to modernize their infrastructure and unlock the full potential of the cloud. Among the various tools available for data migration, the AWS Data Migration Service (DMS) stands out as a powerful and comprehensive solution. In this article, we will explore the benefits of using the AWS Data Migration Service and how it can simplify and streamline your data migration journey.

 

Seamless Data Replication:

One of the key advantages of using AWS DMS is its ability to perform seamless data replication from various source databases to AWS services. Whether you’re migrating from Oracle, Microsoft SQL Server, MySQL, PostgreSQL, or others, DMS supports a wide range of source databases. This flexibility allows you to replicate data in real-time or perform one-time full data loads efficiently, minimizing downtime and ensuring data consistency throughout the migration process.

 

High Data Transfer Speed:

AWS DMS leverages AWS’s global infrastructure and network backbone, enabling high-speed data transfer between your on-premises databases and AWS services. The service optimizes data transfer by parallelizing data extraction, transformation, and loading operations. This results in faster migration times, reducing the overall migration duration and minimizing the impact on your production environment.

 

Minimal Downtime:

Downtime can have a significant impact on businesses, causing disruptions, revenue loss, and user dissatisfaction. AWS DMS minimizes downtime during the data migration process by enabling continuous replication and keeping the source and target databases in sync. This ensures that your applications can remain operational while the migration is ongoing, with minimal interruption to your business operations.

 

Data Consistency and Integrity:

Maintaining data consistency and integrity during migration is paramount to ensure the accuracy and reliability of your data. AWS DMS provides built-in mechanisms to validate and transform data during the replication process. It performs data validation checks, handles schema and data type conversions, and ensures referential integrity, helping you maintain the quality and integrity of your data as it moves to the cloud.

 

Flexible Schema Mapping and Transformation:

Data migrations often involve schema changes and data transformations to align with the target database’s requirements. AWS DMS offers flexible schema mapping and transformation capabilities, allowing you to define and customize the mapping between the source and target databases. This empowers you to harmonize and optimize the data structure, format, and organization during the migration, ensuring a seamless transition to the cloud.
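As a concrete illustration, table mappings in DMS are defined as JSON rules attached to a replication task. A minimal sketch (the schema name HR and the rule names are placeholders): a selection rule that includes every table in one schema, plus a transformation rule that lower-cases table names on the target.

```json
{
  "rules": [
    {
      "rule-type": "selection",
      "rule-id": "1",
      "rule-name": "include-hr-tables",
      "object-locator": { "schema-name": "HR", "table-name": "%" },
      "rule-action": "include"
    },
    {
      "rule-type": "transformation",
      "rule-id": "2",
      "rule-name": "lowercase-table-names",
      "rule-target": "table",
      "object-locator": { "schema-name": "HR", "table-name": "%" },
      "rule-action": "convert-lowercase"
    }
  ]
}
```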

 

Continuous Data Replication and Change Data Capture (CDC):

AWS DMS supports ongoing replication and Change Data Capture (CDC), enabling real-time synchronization of your databases. CDC captures and replicates data changes as they occur, providing up-to-date data in the target database. This is particularly useful for scenarios where real-time data availability is critical, such as high-volume transactional systems or analytics workloads. With continuous replication, you can maintain a live replica of your on-premises database in the cloud, facilitating data-driven decision-making and minimizing the time gap between data updates.

 

Easy Integration with AWS Services:

AWS DMS seamlessly integrates with various AWS services, offering a range of options for your migrated data. For relational databases, you can leverage Amazon RDS, Aurora, or Redshift as target databases. For NoSQL databases, Amazon DynamoDB can be utilized. Additionally, you can take advantage of the AWS Schema Conversion Tool (SCT) for automated schema conversion when the source and target database engines differ. This tight integration simplifies the migration process and enables you to leverage the full capabilities of the AWS ecosystem.

 

Scalability and Cost-Effectiveness:

By migrating your data to AWS using DMS, you can leverage the scalability and cost-effectiveness of cloud services. AWS provides flexible scaling options, allowing you to scale up or down based on your workload requirements. This scalability eliminates the need for upfront hardware investments and enables you to pay only for the resources you consume, optimizing your cost structure and providing cost savings in the long run.

 

Conclusion:

The AWS Data Migration Service (DMS) empowers organizations to migrate their data to AWS securely, efficiently, and with minimal disruption. From seamless data replication to minimal downtime, data consistency, and easy integration with AWS services, the benefits of using AWS DMS are substantial. By embracing the power of DMS, organizations can unlock the full potential of the cloud, leverage advanced analytics, enhance data-driven decision-making, and embark on their digital transformation journey with confidence.

If you are unable to log into the Lawson System Foundation (LSF) environment and you see “LDAP: error code 49” messages like the following examples in LAWDIR/system/security_authen.log, use the resolution below.

June 24 13:26:43.779 EDT 2023 – default–539786713: [LDAP: error code 49 – 80090308: LdapErr: DSID-0C090439, comment: AcceptSecurityContext error, data 52e, v4563 ]

June 24 13:26:43.779 EDT 2023 – default–539786713 – L(2) : LDAP Bind failed. DN: CN=Infor,OU=Lawson,OU=Other,DC=us

[LDAP: error code 49 – 80090308: LdapErr: DSID-0C090439, comment: AcceptSecurityContext error, data 52e, v4563 ]

Stack Trace :

javax.naming.AuthenticationException: [LDAP: error code 49 – 80090308: LdapErr: DSID-0C090439, comment: AcceptSecurityContext error, data 52e, v4563 ]

June 21 13:25:17.805 EDT 2023 – default-1015973274: Error encountered while getting users DN. Please see logs for details[9xxxcsntmtl7k222uu027itela] Could Not Bind With privileged identity. User [[email protected]][LDAP: error code 49 – 80090308: LdapErr: DSID-0C090439, comment: AcceptSecurityContext error, data 775, v4563 ]

Stack Trace :

javax.naming.AuthenticationException: [LDAP: error code 49 – 80090308: LdapErr: DSID-0C090439, comment: AcceptSecurityContext error, data 775, v4563 ]

Resolution:

Several values can indicate which LDAP function is causing the issue, but the most helpful is usually the AD-specific sub-code that follows the word “data”, as shown in the examples above where the codes are 52e and 775.

525      user not found
52e      invalid credentials
530      not permitted to logon at this time
531      not permitted to logon at this workstation
532      password expired
533      account disabled
701      account expired
773      user must reset password
775      user account locked
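A quick way to pull these sub-codes out of the log is to grep for them; a sketch, with ldap_data_codes as a made-up helper name:

```shell
# Sketch: extract the distinct AD "data" sub-codes (52e, 775, ...) from a
# security_authen.log, with a count of each, most frequent first.
ldap_data_codes() {
  grep -o 'data [0-9a-f]*' "$1" | awk '{print $2}' | sort | uniq -c | sort -rn
}
```

For example: ldap_data_codes $LAWDIR/system/security_authen.log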

 

 

Follow these simple steps on how to resolve the Lawson Fax Integrator error(s):

“There was an error connecting to the server” or “The remote server returned an error: (530) Not logged in” (see screenshot below)

To resolve either of these, first navigate to the Servers >> Lawson tab, then make sure you use valid credentials for the FTP user.

Next, under the Servers >> Portal tab, make sure that you have valid Lawson user credentials. If not, enter the correct information.

Finally, click Update after entering your credentials. This will resolve the 530 FTP invalid credentials error. That’s all there is to it!