Introduction:
In today’s fast-paced business environment, companies often undergo digital transformations, migrating from legacy Enterprise Resource Planning (ERP) systems to modern solutions. However, legacy ERP applications often hold valuable historical data that must be preserved for regulatory compliance, historical analysis, and potential future use. At most companies, application archiving projects fall under the responsibilities of IT Application Managers. In this article, we will explore how to effectively archive data for long-term storage, the methods for storing archived data, and data retention strategies that align with the unique needs of large-scale enterprises.


How to Archive Data for Long-Term Storage:
Archiving data for long-term storage involves preserving valuable historical records while ensuring data integrity, security, and accessibility. IT Application Managers should consider the following best practices:
a. Data Segmentation: Identify the data that needs to be archived, separating it from the active dataset. ERP applications like Infor Lawson, Oracle, SAP, or PeopleSoft may contain vast amounts of historical data, and careful segmentation ensures the right data is archived.

b. Data Validation: Before archiving, ensure data accuracy and completeness. Running data validation checks will help identify and rectify any inconsistencies.

c. Compression and Encryption: Compressing and encrypting the archived data optimizes storage space and enhances security, protecting sensitive information from unauthorized access.
d. User Access: Perhaps the most critical component of any viable data archive solution is how accessible it is to users. The right solution should enable users to access archived records without the need to involve IT.

e. Role-Based Security: Of course, with ease of access comes responsibility. The right solution needs to enforce well-defined security roles to ensure that users can access only the data they are authorized to see.

Methods for Storing Archived Data:
IT Application Managers have various options for storing archived ERP data, including:
a. On-Premises Storage: Keeping archived data on local servers allows complete control over the storage environment and immediate access to data when needed. However, on-premises storage can be costly due to infrastructure investments and ongoing maintenance. It may also face limitations in disaster recovery options and data accessibility for remote employees, potentially hindering seamless operations in geographically dispersed organizations.

b. Cloud-Based Storage: Utilizing cloud computing platforms for data archiving offers scalability, cost-effectiveness, and high availability. Cloud providers like AWS, Azure, or Google Cloud offer secure and reliable solutions for long-term data retention. However, storing structured data in the cloud without a structured presentation layer falls short of meeting all of the requirements stated above.

c. Cloud-Based Application: Combining the benefits of cloud computing with a well-thought-out presentation layer is the most complete way to address the challenges of ERP data retention. This option provides the freedom to decommission and eliminate on-premises servers while maintaining data integrity and giving users an easy way to continue accessing the data in the cloud.

Data Retention Strategies:
Data retention strategies aim to define the retention period for archived data, ensuring compliance with industry regulations and business needs. IT Application Managers should consider the following approaches:
a. Legal and Regulatory Requirements: Compliance with industry-specific regulations, such as HIPAA, GDPR, or SOX, requires setting appropriate data retention periods to avoid penalties and legal issues.

b. Business Needs: Align data retention policies with the company’s specific business requirements and operational workflows.

c. User Stories: Understanding the needs of your subject matter experts (SMEs) is key to archival success. The SMEs understand the real data needs of the business, the audit requirements, and what information needs to be readily accessible.
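A retention policy ultimately reduces to a lookup that archiving jobs and purge reports can share. The Python sketch below is illustrative only: the record types and year counts are hypothetical placeholders, and real values must come from your legal and compliance teams:

```python
from datetime import date

# Illustrative retention periods in years. These record types and values
# are hypothetical; actual periods must be set by legal/compliance
# (HIPAA, GDPR, SOX, etc.).
RETENTION_YEARS = {
    "payroll": 7,
    "general_ledger": 10,
    "ap_invoices": 7,
}

def purge_eligible(record_type: str, archived_on: date, today: date) -> bool:
    """Return True once a record's full retention period has elapsed."""
    years = RETENTION_YEARS[record_type]
    try:
        expires = archived_on.replace(year=archived_on.year + years)
    except ValueError:
        # A Feb 29 archive date landing on a non-leap year rolls to Mar 1.
        expires = date(archived_on.year + years, 3, 1)
    return today >= expires
```

Centralizing the table this way means a regulation change is a one-line edit rather than a hunt through purge scripts.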


Conclusion:
Archiving old ERP applications is a crucial responsibility for IT Application Managers in large-scale enterprises with extensive data centers and cloud computing operations. By understanding how to archive data effectively, selecting a suitable solution, and implementing data retention strategies, organizations can preserve valuable historical information securely and compliantly. As companies continue to evolve and modernize, the archiving process becomes a strategic investment in the long-term success of their operations. For IT Application Managers seeking a comprehensive and reliable archiving solution, APIX offers a user-friendly, cost-effective, and secure data archiving platform tailored to the unique needs of large enterprises. To learn more, visit https://www.nogalis.com/lawson-data-archive/

This article covers the steps for configuring Lawson to automatically generate additional output file types (XML, CSV, etc.) for each batch job that is run.


In LAWDIR/system, you will need to look for an rpt.cfg file. If the file doesn’t exist, then you must create it.

Next, to generate XML for all batch jobs, you must set RUNTIME_XML to ON.  For CSV, the configuration type is RUNTIME_CSV.

See below for a sample file:

lawdir/system/rpt.cfg
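The sample file contents did not come through above. Based on the two settings just described, a minimal rpt.cfg might look like the following (the exact syntax is an assumption; verify against your Lawson documentation before use):

```
RUNTIME_XML ON
RUNTIME_CSV ON
```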

No restart or downtime is required for these changes to take effect; the new configuration will simply be applied to your batch jobs moving forward.


To adjust the maximum heap size of a node in Landmark Grid, you first need to be logged into the Grid Management Console. From there, follow these steps:

  1. Click the Configuration Manager link (the gears icon in the top right corner).
  2. Next, click “Applications”.
  3. Select your Landmark deployed application.
  4. Click “Edit Properties”.
  5. Under Grid Defined Properties, click “Max Heap”.
  6. Select the “All” radio button.
  7. Find the node you wish to adjust, and click the value it is currently assigned.
  8. In the popup window, enter a new value and click Save.
  9. In the top left corner of the screen, click Save again to apply the changes.
  10. Click the Home link in the top right corner.
  11. Click the Stop link (the black box in the top right corner of the node you adjusted).
  12. The system will shut down that JVM and then automatically restart it for your change to take effect.

And you’re done. Good luck!

If for whatever reason you need a Lawson profile ID, below is a quick way to access it in LID.

  1. Log into LID and go to a temp directory you’re okay with dumping a file to.
  2. Run this command: lsdump -f roledump.txt ROLE SuperAdminRole
  3. Go to the directory where you dumped the file and open roledump.txt in a text editor.
  4. Search for ProfileID until you find the one you’re looking for.

Chances are, SuperAdminRole was created with an admin class assigned to it. You can also view this information using the LASHOW command without having to dump the file.


Good luck!


Many organizations opt to engage Lawson consultant teams for managing their Lawson Security. These consultant teams offer managed services at a fixed monthly rate and possess extensive knowledge and expertise. This service is particularly suitable for larger organizations, but smaller organizations that do not require a full-time Lawson employee on-site may also find it beneficial. Nogalis provides this service, and you can contact us via our contact page for further information.


Follow these steps to edit the domain name on the ADFS instance:

Update the Domain Name

  1. Open the ADFS Management application from the ADFS server.
  2. On the right, select “Edit Federation Service Properties”.
  3. Change the Federation service name and identifier to reflect the new domain name.

Regenerate the Token Certificates

  1. Open a PowerShell session on the ADFS Server
  2. Run “Update-ADFSCertificate”, which will generate new token-decrypting and token-signing certificates.
  3. The old certificate will remain primary on the instance and cannot be deleted until a new primary is selected.
  4. In PowerShell, run the command “Set-ADFSProperties -AutoCertificateRollover $false”
  5. Now you can right-click the secondary (new) certificates and set them as primary.
  6. Delete the old certificates.
  7. Reset the rollover option in PowerShell: “Set-ADFSProperties -AutoCertificateRollover $true”

Deploy the new Server Certificate

  1. Get the Thumbprint value of the new certificate for the new domain.
  2. In PowerShell, run the command “Set-ADFSSslCertificate -Thumbprint <value you saved in step 1>”
  3. Bounce the ADFS service.

Your ADFS domain/URL has been updated!


To clear an existing sync lock in order to rerun the sync from the beginning, use one of the following two methods.

From ssoconfig -c:

  1. Log in to the Infor Security Services web page.
  2. On the menu bar, navigate to Federation > Manage Locked Process.
  3. If there is a process listed, make a note of the process that is locked.
  4. From the command line, run the ssoconfig utility: type ssoconfig -c
  5. At the prompt, enter the ssoconfig password.
  6. Select Manage Locked Processes.
  7. Select the number of the process that needs to be unlocked.
  8. Once it has been cleared, a message will appear that the process has been cancelled.
  9. Select the number that corresponds to Exit to return to the ssoconfig menu.
  10. Select EXIT to quit the ssoconfig utility.

From the ISS web page:

  1. Log in to the Infor Security Services web page.
  2. On the menu bar, navigate to Federation > Manage Locked Process.
  3. If there is a process listed, check the box and click the “Unlock” button. (Note: There may be more than one locked process, but you only need to unlock one, and all will be unlocked.)


These 5 tips could make applying patches that much less stressful and are also good practice in general.


Tip 1: Check the existing patch logs to see if a patch has already been applied previously and to confirm current versioning. This is good to check after a patch has been applied as well.

These logs can be generated in LID with the following commands:

perl %GENDIR%\bin\patches_installed_report <productline>

perl %GENDIR%\bin\source_versions_report <productline>


Tip 2: Restart the LSF server (or services) to ensure no processes are being held up. When it boots back up, turn off the WebSphere LSF Appserver service before applying a patch so that users cannot log on, especially if the patch needs to be applied during or close to work hours.


Tip 3: Run the dbdef command to make sure there is a connection to the database before patching.


Tip 4: When activating or staging multiple patches, run this command to speed up the post compile process:

qcontrol -jlocal,4  – This will set the server’s cores to 4 when processing form compiles. Set it back to 2 when done. You can also check the status of the compiled jobs with the command: qstatus | head -5


Tip 5: If a Plus dictionary is created after patching, it’s typically good practice to compile the entire product line with the command cobcmp (be aware this can take 20-30 minutes to complete; tip 4 helps with this). This ensures that all programs are functioning correctly before being passed to testers.


Bonus Tip: Verify security is on before handing the system off to the testers! Hope these were helpful.


If you require assistance with applying patches for your v10 system, it is common for organizations to engage Lawson consultant teams for managed services, which are available at a fixed monthly rate. These consultant teams possess significant knowledge and expertise and are suitable for larger organizations. Additionally, smaller organizations that do not require a full-time Lawson employee on-site may also find this service advantageous. Nogalis offers this service, and you can contact us through our contact page for further details.