Cloud computing is no longer just a back-office utility quietly managed by IT—it’s now a major financial focus for organizations. According to ERP Today’s Maneesha Tiwari in her 2026 report, based on a survey of 300 U.S. CFOs and senior finance leaders, rising cloud costs are now a board-level concern. Companies are racing to fund AI and automation initiatives, but cloud spending is climbing, and efficiency is lagging. The survey found that 88% of organizations report higher cloud costs, while 69% of finance leaders believe 10–30% of that spend is wasted. CFOs are stepping up, working closely with IT to track usage, optimize workloads, and modernize systems. The goal isn’t just cutting costs—it’s freeing up budget for growth and innovation. Metrics like cloud utilization, total spend, and costs as a percentage of revenue are now key performance indicators. For ERP-heavy organizations, this is especially critical. With finance, supply chain, and other core workloads running in the cloud, efficiency directly impacts margins and the ability to scale next-generation capabilities. Cloud is no longer invisible—it’s a strategic lever for driving innovation while protecting the bottom line.

 

For Full Article, Click Here

If you’re managing IPA flows daily, you may be used to working in Process Server Administrator. This gives you access to workunits, configuration sets, and File Channels, among other features, depending on your security access.

Now, if you’re logged into Process Server Admin as your own user, you don’t have access to the scheduled IPA jobs. Creating new ones will tie them to your user, so if you ever leave the organization, they will all fail. Scheduling new IPA jobs under the system user therefore requires the system user password.

 

However, the good news is that you can still edit the existing IPA scheduled jobs that run under the system user.

To edit them, you need Async Administration access. Click the Process Server Admin drop-down box and select Async Administration.

Select the Request Management tab, then under Class, search for PFITrigger and press Enter.

You will now be able to view all the scheduled IPA jobs; double-click one to view and edit it.

Note that Mapping field 1 will likely show the IPA process name, which is helpful since the job itself may be poorly named.

 

Enterprise Resource Planning (ERP) systems have long kept businesses running smoothly, but Artificial Intelligence (AI) and automation are redefining their role. In an article in the Dallas Business Journal, INSPYR Solutions explains that modern ERP platforms are becoming smarter, more predictive, and more adaptive, giving organizations a competitive edge in today’s fast-paced environment. AI and machine learning turn ERP systems into proactive tools. They analyze patterns, detect anomalies, and predict outcomes—from maintenance needs to cash flow fluctuations—allowing teams to address issues before they arise. Intelligent forecasting goes further, using real-time data to improve demand planning, resource allocation, and operational resilience. Automation also transforms workflows. Tasks like invoice processing, payroll, and compliance reporting can now run automatically, reducing errors and freeing staff for higher-value work. Predictive analytics elevate decision-making, surfacing insights such as production bottlenecks, sales trends, and cost-saving opportunities. Resource optimization is a key benefit. AI-driven ERPs schedule staff based on demand, streamline inventory, and recommend operational improvements, boosting efficiency and financial performance. Across the organization, these tools empower smarter, faster, and more confident decisions. To fully leverage AI-enabled ERP, organizations must prepare by ensuring high-quality data, modernizing workflows, and training teams. Partnering with ERP specialists helps implement and optimize these systems effectively. Moreover, AI-powered ERP isn’t just a tech upgrade—it’s a strategic advantage. Companies that embrace these tools now will gain efficiency, agility, and better-informed decision-making, setting themselves up for long-term success.

 

For Full Article, Click Here

Manufacturers face a growing challenge: keeping their increasingly complex systems—ERP, CRM, MES, WMS, eCommerce, and shop floor tools—connected in a way that supports growth and change. ERP Today’s Jake Rohrer and Rachana Dalmia explore how manufacturers can navigate this complexity and choose the right integration strategy and framework. Traditionally, many organizations rely on point-to-point integrations—direct connections between systems. While simple initially, this approach becomes fragile as operations evolve, causing rework, manual interventions, and delayed initiatives. The hidden costs of these fragile integrations often outweigh the upfront savings, especially when adding AI, analytics, or automation initiatives. Enter integration platform as a service (iPaaS). By centralizing integration management, iPaaS provides visibility, governance, and scalability, reducing operational risk and supporting faster adaptation to change. The authors emphasize that the choice isn’t binary: point-to-point can still work for small, stable environments, while iPaaS is essential when multiple critical systems, plants, or business units need reliable, operationally critical data exchange. Manufacturers should consider factors like system complexity, frequency of change, growth in applications, and operational visibility when deciding their approach. Many find a hybrid strategy—using point-to-point for stable processes and iPaaS for complex, dynamic needs—offers the best balance. Ultimately, integration isn’t just technical—it’s strategic. A thoughtful approach allows manufacturers to modernize systems, adopt new capabilities, and respond to disruption confidently, turning integration from a constraint into a competitive advantage.

 

For Full Article, Click Here

When working with Perl and MySQL, one of the most reliable ways to establish a connection is through ODBC (Open Database Connectivity). This approach provides flexibility, as ODBC acts as a bridge between your Perl application and the MySQL database. In this post, we’ll walk through how to set up a simple Perl script that connects to a MySQL database using ODBC.


Prerequisites

Before we start coding, make sure the following are in place:

  1. Perl installed on your system.
  2. MySQL ODBC driver configured with a DSN (Data Source Name).
    • On Linux/Unix, you can configure this in /etc/odbc.ini.
    • On Windows, use the ODBC Data Source Administrator.
  3. Perl modules:
    • DBI — a generic Perl database interface.
    • DBD::ODBC — the ODBC driver for DBI.
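
For step 2 above, a minimal DSN entry on Linux/Unix might look like the following sketch in /etc/odbc.ini. The DSN name, server, and database shown here are placeholders, and the Driver value must match the MySQL ODBC driver name registered in your odbcinst.ini:

```ini
[your_dsn_name]
Description = MySQL connection for Perl scripts
Driver      = MySQL ODBC 8.0 Unicode Driver
Server      = localhost
Port        = 3306
Database    = your_database
```

On Windows, the same fields are filled in through the ODBC Data Source Administrator GUI instead.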

You can install the modules via CPAN if they aren’t already installed:

cpan DBI
cpan DBD::ODBC


The Perl Script

Here’s a simple Perl script that demonstrates how to connect to a MySQL database using ODBC:

#!/usr/bin/perl

use strict;
use warnings;
use DBI;

# Define the DSN, username, and password
my $dsn      = 'DBI:ODBC:your_dsn_name';
my $username = 'your_username';
my $password = 'your_password';

# Connect to the database
my $dbh = DBI->connect($dsn, $username, $password, {
    RaiseError => 1,
    PrintError => 0,
    AutoCommit => 1,
}) or die "Failed to connect: " . DBI->errstr;

print "Connected to the database successfully!\n";

# Example query
my $sql = 'SELECT * FROM your_table_name';
my $sth = $dbh->prepare($sql);
$sth->execute();

# Fetch and display results
while (my @row = $sth->fetchrow_array) {
    print join(", ", @row), "\n";
}

# Clean up
$sth->finish();
$dbh->disconnect();

print "Disconnected from the database.\n";


Breaking It Down

  1. Connection String:
    The $dsn specifies the ODBC DSN you’ve configured, prefixed with DBI:ODBC:. Replace your_dsn_name with the DSN you set up for your MySQL database.
  2. Connection Attributes:
    • RaiseError => 1: Automatically die on errors.
    • PrintError => 0: Prevents warnings from being printed automatically.
    • AutoCommit => 1: Ensures changes are committed immediately.
  3. Query Execution:
    The script prepares a simple SELECT * query, executes it, and prints the results row by row.
  4. Cleanup:
    Always call $sth->finish() and $dbh->disconnect() to release resources properly.
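
One practical consequence of AutoCommit => 1 is that every statement is committed immediately. If you later need several statements to succeed or fail together, you can turn AutoCommit off and manage the transaction yourself. Here is a minimal sketch; the table and column names (your_table_name, audit_log, qty, id, note) are placeholders, not part of the example above:

```perl
# Assumes $dbh from the connection example above (RaiseError => 1).
$dbh->{AutoCommit} = 0;    # manage transactions manually

eval {
    # Both statements commit together or not at all
    $dbh->do('UPDATE your_table_name SET qty = qty - 1 WHERE id = ?', undef, 1);
    $dbh->do('INSERT INTO audit_log (note) VALUES (?)', undef, 'qty decremented');
    $dbh->commit;
};
if ($@) {
    warn "Transaction failed, rolling back: $@";
    $dbh->rollback;
}

$dbh->{AutoCommit} = 1;    # restore the original setting
```

Because RaiseError is enabled, any failing statement dies inside the eval, and the rollback undoes the partial work.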

Why Use ODBC?

Using ODBC adds a layer of portability to your applications. If your database backend changes in the future, you can update the DSN and driver without rewriting large portions of your Perl code. This is particularly useful in environments with multiple types of databases or when migrating systems.


Final Thoughts

With just a few lines of Perl code, you can connect to a MySQL database using ODBC and start running queries. While this example demonstrates a basic SELECT, the same connection can be used for inserts, updates, and more complex operations.
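
For those inserts and updates, DBI’s placeholder syntax (?) is worth using from the start, since it lets the driver quote values safely and avoids SQL injection. A short sketch, again using a placeholder table and columns:

```perl
# Assumes $dbh from the connection example above.
my $insert = $dbh->prepare(
    'INSERT INTO your_table_name (name, qty) VALUES (?, ?)'
);
$insert->execute('widget', 42);   # values are bound, not interpolated
$insert->finish();
```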

Whether you’re maintaining legacy Perl applications or building new scripts to interact with MySQL, ODBC gives you a stable, flexible way to manage database connectivity.

 

In today’s fast-moving tech landscape, AI-driven security and compliance tools promise speed and efficiency—but at a cost, warns Emil Sayegh (serial technology CEO) in a recent Forbes article. As organizations face rising regulatory requirements and heightened customer expectations, platforms that automate compliance—using AI for monitoring, evidence collection, and documentation—are becoming essential. However, Sayegh cautions that the rush to accelerate compliance can mask underlying risks.

The core issue is the focus on outputs rather than actual operational integrity. Compliance isn’t probabilistic; it demands precise, verifiable controls. Recent events, such as the Delve incident, highlight how quickly gaps between represented and actual compliance can erode trust, impacting customers, partners, and legal exposure, especially in federal contracting contexts. AI and automation are valuable for efficiency but cannot replace rigorous validation and oversight. Organizations must maintain visibility into control implementation, ensure alignment between representations and reality, and incorporate independent checks. Compliance should be an operational condition, not just a report or platform output.

Sayegh’s takeaway is this: speeding up compliance is necessary, but organizations must pair automation with accountability. Transparency, traceability, and hands-on validation are key to ensuring that AI-driven processes enhance security rather than create hidden liabilities. In short, tools can help, but responsibility—and risk—ultimately remain human.

 

For Full Article, Click Here

Summary: You have an email node going out to users with an attachment from a network path, but it seems to be failing.

Here is the email node; notice the network path beginning with \\

This same email node will fail when you try to add an attachment this way, reporting that the file “does not exist”

One way to properly attach a file to the email node is to use IPA FileAccess nodes to read the file and then write it to Landmark so it can be attached locally.

  1. FileAccess Node 1 (read) will need to be set to the Main configuration so it can read the file from your LSF server (assuming that is where your file exists).
  2. FileAccess Node 2 (write) will need to be set to the System configuration so it can write it to your Landmark server. If your file will be on Landmark automatically or by some other means, skip this step.
  3. Now simply remove your existing \\ attachment network path and add a local Landmark path like D:\lawson\fileAttachmentName.txt

Done! Now just run and test it to make sure it works.

Enterprise Resource Planning (ERP) migrations often fail not because of the software, but due to poorly managed data, processes, and people. In a recent ERP Today post, senior editor Chris Vavra discusses how Aptean is helping organizations avoid these common pitfalls with structured guidance, case studies, and the right tools. Data quality and scope creep are two major threats to ERP success. Many companies try to migrate all historical records, only to face issues with duplicates and incomplete data that disrupt the process. Aptean recommends a disciplined four-step strategy for migration: define scope, build a cross-functional team, standardize and map data, and test in small increments. This approach helps ensure that data governance is solid from the start, preventing costly mistakes post-migration. Process complexity is another hurdle. Aptean urges businesses to use the migration process as an opportunity to streamline and adopt industry best practices, instead of carrying over legacy customizations. This reduces technical debt and makes future upgrades easier. Additionally, effective change management is crucial. Many companies fail to account for the “tribal knowledge” employees have, which can resurface as shadow systems. Aptean’s hands-on approach to change management, including realistic timelines and pilot groups, ensures smoother transitions and better adoption. For technology leaders, moving to a cloud-based ERP system offers significant benefits, including improved security and easier upgrades. However, selecting the right vendor requires evaluating their migration methodology, industry-specific templates, and post-go-live support.

 

For Full Article, Click Here

The term “SaaSpocalypse,” used by Rick Rider, SVP of product management at Infor, refers to the predicted upheaval in enterprise software caused by artificial intelligence (AI) agents, which are expected to replace many point SaaS applications. In an interview with Jordan Berger, SVP of TMT Market Intelligence at AlixPartners, ERP Today’s Chris Vavra explains that while this scenario sounds dramatic, the real impact is more nuanced: lightweight, UI-heavy SaaS tools are vulnerable, but core enterprise resource planning (ERP) systems—handling critical data, compliance, and audit trails—remain essential. Berger notes that AI agents are already automating low-judgment workflows like HR intake, procurement approvals, and routine reporting. ERP platforms that thrive will act as stable, governed transaction cores with clean APIs, identity integration, and fine-grained permissioning, while vendors relying mainly on configurable UIs and generic workflows face the greatest risk.

This environment is accelerating demand for modular, API-first ERP architectures rather than full-suite replacements. Enterprises are focusing on making ERP programmable and AI-ready, building narrow, differentiating capabilities in-house while relying on ERP for compliance and integration. Industries with structured, routine work—professional services, retail HQs, and tech—are experiencing the fastest ERP workflow shifts, whereas highly regulated sectors like healthcare, energy, and finance will change more gradually due to compliance and operational constraints. In the post-“SaaSpocalypse” world, ERP and critical industry systems that control data, enforce governance, and enable agent orchestration will dominate, while standalone SaaS point solutions lose relevance. Organizations investing in AI-ready ERP foundations, flexible integrations, and automated governance will capture the most value and maintain a competitive edge.

 

For Full Article, Click Here

To view and manage favorites in the Infor Lawson Portal, follow the steps below.

From the home page, select the hamburger icon beside Menu > User Options and select the Favorites tab to view the Favorites configuration.

 

URL / Token

 

The URL is a direct link to an external site, and the token is what’s available to your user within a specified data area and system code.

Click Save to add the new favorite item to your Favorites list. You can also click “Edit Favorites” directly on the home page, then click the Add icon to create new favorites, such as URLs, tokens, or custom forms.