Building your own native ERP integration offers greater control, performance, and long-term cost savings by avoiding the sync issues, delays, and limitations common with third-party middleware. In an article on DesignRush, Ilze-Mari Grundling outlines three key reasons why building your own ERP integration may be more advantageous than relying on third-party middleware solutions:

  1. Customization and Control: Developing a custom ERP integration allows for tailored solutions that align precisely with your business processes, ensuring better performance and reduced risk of synchronization failures.
  2. Cost Efficiency: While middleware solutions may seem cost-effective initially, they can lead to hidden expenses due to synchronization issues and performance bottlenecks, making custom integrations potentially more economical in the long run.
  3. Scalability and Performance: Custom integrations can be optimized for your specific needs, leading to improved performance and scalability, whereas middleware solutions may introduce latency and become bottlenecks as your business grows.

The article emphasizes that while middleware can offer quick fixes, building a custom ERP integration provides greater control, cost savings, and scalability, ultimately supporting long-term business growth.

 

For Full Article, Click Here

When running Lawson Security Administrator (LSA), you may get stuck on the authentication screen even after going through the two-factor sign-in. Here is how to resolve this issue.

 

Symptoms (LSA version 10.0.3.01065):

  • Run LSA as admin and log in; PingID authenticates successfully.
  • A script error then pops up; clicking Yes, No, or closing the dialog leads to the same result.
  • LSA does not connect afterward.
  • LSA logging last produced results on 9/20 (logs are no longer updating).

 

Installing LSA version 10.0.3.01069 resolves this issue. And that’s it!

Today’s CIOs and CDOs face mandatory digital transformation, but many are hindered by data debt stemming from outdated, fragmented, and redundant information that stalls progress amid efforts to adopt cloud ERP (enterprise resource planning), AI (artificial intelligence), automation, and seamless user experiences. Mark Vigoroso, CEO of ERP Today, shares an article that discusses how data debt is a significant obstacle to successful digital transformation. Data debt accumulates when companies delay cleaning, integrating, or properly managing their data, leading to inefficiencies and increased costs. CIOs often prioritize new technology deployments over addressing existing data issues, unintentionally compounding data debt. This buildup hampers decision-making, agility, and innovation, as unreliable data underpins critical business processes.

The article emphasizes that tackling data debt requires a strategic approach, including establishing data governance, investing in data quality tools, and fostering a culture of data literacy. It warns that ignoring data debt can lead to a “digital transformation trap,” where efforts are hampered by poor data foundations. Successful organizations proactively manage their data assets, viewing data quality as a continuous, integral part of digital initiatives. Leaders should align IT and business teams to identify data issues early and develop a roadmap for remediation. Vigoroso highlights the importance of automation and modern data platforms for streamlining data management tasks, and stresses the need for clear metrics to measure data health and progress. Overcoming data debt enables organizations to unlock the full value of their digital investments, improve customer experiences, and gain competitive advantage. Ultimately, addressing data debt is essential for sustainable digital transformation, and it is a continuous effort rather than a one-time fix.

 

For Full Article, Click Here

Infor Process Designer (IPD) enables you to debug your processes. To activate the Debug view, go to the menu bar and select Window > Show View > Other.

From there, select Infor Process > Debug and click OK.

The Debugger feature set includes:

  • Breakpoint management: breakpoints can be enabled for any activity node within a process, and cause a running process to pause just before the specified activity node. When the process pauses, process variables can be examined and/or modified. Process execution can be resumed at the breakpoint, or at any other activity node if desired.

To set this up, right-click the node you want to debug and select Debug.

The “Run to this activity” breakpoint is temporary and only valid for the current execution of the process. A “run to” breakpoint can be useful when you want to pause at a specific activity node during the current execution, but you do not want it to stay in effect for any subsequent executions. All temporary breakpoints are cleared before a process starts execution.

This feature is available only when running the process locally.

  • Process execution controls

The run process control starts the execution of a process and is available when the process is not currently running or paused. The Run Mode option lets you run the process either locally or on the server. A flow run on the server from the designer will still provide you with status and log information.

Specify Input data can be used to pass data to a process. The data will be added to the workunit once the workunit has been created. You can select from these options when specifying the input data:

  • No input data: Select this option if you do not want to pass any specific data to the process.
  • Use data from Workunit: You must specify the workunit number. The process will fetch the input data from that workunit and use the same data in the current process. If the workunit or input data does not exist, then the input data will be null.
  • Use connector: You can select between two options. The first, Specify input data, lets you specify the data that you want to pass to the process. The second, Input data file, reads the data from the given file and uses it in the process.

  • Runtime variable examination and modification

This feature is available only when running the process locally (that is, Run Mode is set to Local).

Once a running process is paused at an activity node, the current value of variables in the process can be examined and modified if needed.

The Debug view shows a list of currently running processes. The variables accessible to the activity node at the pause point are shown, organized by variable category and activity node. Selecting a variable shows its current value, and a new value can be specified.

Implementing and utilizing cloud enterprise resource planning (ERP) systems enables organizations to enhance operational efficiency and sustainability simultaneously, helping businesses reduce their carbon footprint while maintaining profitability. In an article on TechTarget, tech writer Jim O’Donnell shares how Cloud ERP systems can support organizations in achieving their sustainability objectives. Cloud ERP offers real-time data insights, enabling better tracking of environmental impact and resource usage. By centralizing data, it facilitates more accurate reporting on sustainability metrics and compliance. Cloud solutions reduce the need for physical hardware, lowering energy consumption and carbon footprint. They also enable scalable operations, allowing companies to adjust resources efficiently and minimize waste. Integration with other digital tools enhances sustainability initiatives, such as supply chain transparency and waste reduction. Cloud ERP supports automation of processes, leading to increased operational efficiency and reduced resource waste. The flexibility of cloud platforms helps organizations adapt to changing sustainability standards and goals. Cloud ERP systems can facilitate better supplier management by assessing suppliers’ sustainability practices. They also enable scenario modeling to evaluate the environmental impact of different strategies. Cloud technology promotes collaboration across departments and with external stakeholders on sustainability efforts. Data security and compliance are strengthened through cloud-based solutions, ensuring integrity in sustainability reporting. Cloud ERP’s cost-effective nature makes it accessible for organizations of various sizes to pursue sustainability initiatives. Overall, cloud ERP acts as a strategic tool for organizations aiming to improve environmental performance and meet sustainability targets efficiently.

 

For Full Article, Click Here

Enterprise resource planning (ERP) project failures can result in massive financial losses and leadership shakeups—yet executive oversight remains lacking in many initiatives. With most ERP efforts exceeding budget, it’s critical for the C-suite to take a more active, strategic role. From funding decisions to aligning priorities, executive leadership directly shapes project outcomes. In an article on CIO.com, tech expert Ted Rogers explores how poor oversight increases risks and outlines key areas where focused executive involvement can dramatically improve ERP success. ERP programs are major investments often backed at the highest levels, yet they continue to suffer from delays and cost overruns—sometimes exceeding 50%. A lack of executive oversight is a key contributor, leading to scope creep and inflated budgets. Perceived uniqueness of processes can further drive up costs. Organizations that fail to involve leadership early and consistently risk project failure. To counter this, executives must stay engaged across five critical areas to guide programs toward success.

Effective ERP executive oversight goes far beyond surface-level planning—it requires deep involvement, formal agreements, and active governance. Real-world examples show that when executives are engaged, project risks are addressed early, scope stays controlled, and accountability is clear—even helping resolve vendor performance issues. Strong leadership can mean the difference between failure and long-term success. Rogers concludes that strong governance is essential for ERP success—it aligns goals, drives accountability, and ensures teams and vendors stay on track. For CIOs and digital leaders, treating executive oversight as a structured, ongoing responsibility—not just a one-time check-in—can be the key to avoiding costly disruption and achieving transformation goals.

For Full Article, Click Here

These 5 tips could make applying patches that much less stressful and are also good practice in general.

 

Tip 1: Check existing patch logs to see whether a patch has already been applied and to verify current versioning. This is also good to check after a patch has been applied.

These reports can be generated in LID with the following commands:

perl %GENDIR%\bin\patches_installed_report <productline>

perl %GENDIR%\bin\source_versions_report <productline>
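If you check several product lines regularly, the two report commands above can be wrapped in a small script. This is a minimal sketch, assuming a Windows host where the LID shell resolves %GENDIR%; the function names are hypothetical helpers, not Lawson utilities, and the default dry-run mode only prints the commands:

```python
import subprocess

def build_patch_report_commands(productline):
    """Hypothetical helper: assemble the two Tip 1 report commands
    for a given product line (paths resolved by the shell at run time)."""
    gendir = "%GENDIR%\\bin"
    return [
        f"perl {gendir}\\patches_installed_report {productline}",
        f"perl {gendir}\\source_versions_report {productline},"[:-1],
    ]

def run_patch_reports(productline, dry_run=True):
    """Run (or, by default, just preview) both patch reports."""
    for cmd in build_patch_report_commands(productline):
        if dry_run:
            print(cmd)  # preview only; nothing is executed
        else:
            subprocess.run(cmd, shell=True, check=True)
```

Running with dry_run=False executes each report in sequence and raises on a non-zero exit, so a failed report is noticed immediately.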

 

Tip 2: Restart the LSF server (or its services) to ensure no processes are being held up. When it boots back up, stop the WebSphere LSF Appserver service before applying a patch so that users cannot log on. This is especially important if the patch needs to be applied during or close to working hours.

 

Tip 3: Run the dbdef command to confirm there is a connection to the database before patching.

 

Tip 4: When activating or staging multiple patches, run this command to speed up the post-compile process:

qcontrol -jlocal,4

This sets the server to use 4 cores when processing form compiles. Set it back to 2 when done (qcontrol -jlocal,2). You can also check the status of the compiled jobs with the command: qstatus | head -5

 

Tip 5: If a Plus dictionary is created after patching, it’s typically good practice to compile the entire product line with the command cobcmp (be aware this can take 20-30 minutes to complete; Tip 4 helps with this). This ensures that all programs are functioning correctly before the product line is passed to testers.
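The Tip 4/5 sequence is easy to script so the core count always gets restored after a full compile. A minimal sketch, assuming the qcontrol/cobcmp commands behave as described in the tips above; the wrapper itself is hypothetical, not a Lawson utility, and defaults to a preview-only dry run:

```python
import subprocess

# Sequence from Tips 4 and 5: raise compile cores to 4, run the full
# product-line compile, then restore the default of 2 cores.
FULL_COMPILE_STEPS = [
    "qcontrol -jlocal,4",   # use 4 cores for form compiles
    "cobcmp",               # full compile (can take 20-30 minutes)
    "qcontrol -jlocal,2",   # set cores back to 2 when done
]

def run_full_compile(dry_run=True):
    """Run (or, by default, just preview) the full-compile sequence."""
    for cmd in FULL_COMPILE_STEPS:
        if dry_run:
            print(cmd)  # preview only; nothing is executed
        else:
            subprocess.run(cmd, shell=True, check=True)
    # Progress can be checked separately with: qstatus | head -5
```

Because check=True raises on a failed step, the restore command only runs if cobcmp succeeds; wrap the loop in try/finally if you want the cores reset even on failure.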

 

Bonus Tip: Verify security is on before the system is handed back to the testers! Hope these tips were helpful.

 

If you found this article helpful, Nogalis provides managed services and expert technical resources to assist with Lawson patching and system maintenance. If applying patches feels overwhelming or time-consuming, our team can simplify the process by managing everything from version checks to database connectivity and post-compile optimizations. Let us help ensure your patches are applied efficiently and your system is running smoothly. Contact us to learn more about how we can support your Lawson environment.

Enterprises are increasingly adopting advanced technologies like Gen AI to address business challenges, but the core issue lies in bridging the gap between technical solutions and actual business outcomes rather than the technology itself. Javeed Nizami, chief technology officer at Syniti, shares an article on The Fast Mode that emphasizes the importance of aligning data strategies with overall business objectives to drive growth and innovation. It highlights the historical disconnect between IT and business units, often resulting in missed opportunities. To bridge this gap, organizations should foster closer collaboration and shared understanding of data needs. Developing a unified data governance framework ensures data quality, security, and compliance across departments. Leveraging advanced analytics and AI can provide actionable insights that support strategic decision-making. Nizami advocates for a data-driven culture where leadership champions data initiatives and promotes data literacy among employees. Organizations should invest in modern data infrastructure, such as cloud platforms, to enable scalability and agility. Regular communication between IT and business teams helps identify priorities and align efforts effectively. Implementing agile methodologies in data projects can accelerate value delivery. Data democratization empowers non-technical staff to access and utilize data, enhancing overall organizational agility. The article stresses the need for clear data governance policies to manage data privacy and ethical considerations. Building a robust data architecture supports integration of diverse data sources for comprehensive analysis. Continuous training and upskilling are essential to keep pace with evolving data technologies. Leadership’s active involvement ensures data initiatives align with strategic goals. Establishing KPIs related to data initiatives helps measure progress and impact.

Embracing a proactive approach to data management can unlock new revenue streams and competitive advantages. Nizami concludes that a well-integrated data strategy transforms data from a support function into a strategic asset, fostering innovation and growth.

 

For Full Article, Click Here

The message “lase_server logging issue – Invalid message is received and size” indicates that an error message that is too large has been sent from the security server.

Example full message text:

default.SEVERE server.SecurityEventHandler.run(): SecurityEventHandler #816 got exception. com.lawson.security.server.LawsonNetException: Got exception while reading from connection Socket[addr=/127.0.0.1,port=64706,localport=450000].

Stack Trace : com.lawson.security.server.LawsonNetException: Got exception while reading from connection Socket[addr=/127.0.0.1,port=6406,localport=50000].

at com.lawson.security.server.AbstractDefaultEventSource.read(AbstractDefaultEventSource.java:339)

at com.lawson.lawsec.server.SecurityEventHandler.run(SecurityEventHandler.java:151)

Caused by: java.io.IOException: Invalid message is received and size:300095616

at com.lawson.security.server.AbstractDefaultEventSource.readMsg(AbstractDefaultEventSource.java:351)

at com.lawson.security.server.AbstractDefaultEventSource.read(AbstractDefaultEventSource.java:352)

 

Resolution

The parameter server.readMsgMaxSize must be added to lsservice.properties.

The server.readMsgMaxSize parameter ensures that a logging message generated when a vulnerability scan is run does not become so large that it causes the system to hang. The parameter catches very large messages and replaces them with shorter sample messages.

Perform these steps to add the parameter.

  1. Stop the Infor environment and the application server.
  2. Open the lsservice.properties file for editing.
    On Landmark, the location is: LASYSDIR/lsservice.properties
    On LSF, the location is: LAWDIR/system/lsservice.properties
  3. Add the parameter server.readMsgMaxSize=[size number] to the end of the file. The default size is 20MB. Infor recommends not setting the size larger than 40MB.
    Example: server.readMsgMaxSize=20
  4. Restart the Infor environment and the application server.
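The edit in step 3 can be scripted so the parameter is only appended when it is not already present. A minimal sketch under the assumption that a plain appended line is sufficient; the function name is hypothetical (not an Infor tool), the size value follows the example above, and the properties path should point at LASYSDIR (Landmark) or LAWDIR/system (LSF):

```python
def ensure_read_msg_max_size(properties_path, size=20):
    """Hypothetical helper: append server.readMsgMaxSize to an
    lsservice.properties file only if it is not already set.
    Returns True if the line was added, False if it already existed."""
    param = "server.readMsgMaxSize"
    with open(properties_path, "r", encoding="utf-8") as f:
        lines = f.read().splitlines()
    if any(line.strip().startswith(param) for line in lines):
        return False  # already present; leave the existing value alone
    with open(properties_path, "a", encoding="utf-8") as f:
        f.write(f"\n{param}={size}\n")  # value per the example above
    return True
```

Remember that, per steps 1 and 4, the Infor environment and application server must be stopped before the edit and restarted afterward for the parameter to take effect.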

 

Cloud ERP (enterprise resource planning) and AI (artificial intelligence) technologies are helping businesses tackle key sustainability challenges like carbon emissions, deforestation, and waste through real-time data, automation, and supply chain transparency. In a Forbes article, Namita Gupta-Hehl, strategic marketing and communication leader at SAP, explores how businesses are using cloud-based ERP systems and AI to meet rising sustainability demands. It emphasizes that sustainability is now a core business priority, with 85% of executives increasing related investments and 70% expecting climate risks to influence their strategies soon. Technologies like SAP’s Green Ledger and Green Token are helping companies reduce carbon emissions, prevent deforestation, and embrace circular economies. Carbon tracking, particularly for complex scope 3 emissions, is a major challenge, but cloud ERP platforms offer real-time data integration and visibility, transforming compliance efforts into strategic insights. The article also highlights the role of circular economy initiatives, where companies like Hilti Group use tools like Circelligence to assess and improve resource circularity. AI enhances this by automating ESG reporting and helping track sustainability progress. Overall, cloud ERP and AI are presented as essential technologies turning environmental challenges into opportunities, allowing companies to thrive while contributing to a sustainable future.

 

For Full Article, Click Here