An alternative to a SQL database backup & restore is to use the Copy Database Wizard in SQL Server.

Right-click the database and select Tasks > Copy Database to launch the wizard. The source and destination can reside on different servers. You can opt to copy without taking the source database offline; it takes longer, but it doesn't require downtime. You can also copy over an existing database if you select the option to drop the destination database if it already exists. This tool is especially useful if you are trying to copy data from a newer version of SQL Server to an older version such as 2008. We have come across that configuration many times in Lawson PROD to DEV environments.

Things to consider:

  1. Can you pay off all your existing AP prior to transitioning to a new system?
    1. This is preferred whenever possible, so you can start your new ERP with fresh AP
  2. Consider the length of the address fields in your new ERP
    1. Do you need to come up with abbreviations that work throughout your addresses in the new system?
    2. Have you used abbreviations you no longer need, and therefore want to undo them and spell the words out in full in your new system?
    3. Do you have duplicate addresses that need to be consolidated prior to transitioning? If you are moving off of Lawson, Vendor locations often have duplicate addresses, which other ERP systems can't accept (see the sketch after this list)
  3. Clean up inactive Vendors and addresses
    1. This is work that should be done now instead of waiting until the last minute. Paying a consultant to clean up your addresses is expensive; hire a temp now and get it done earlier and at a lower cost.
  4. Is there anything special about your AP processing that needs to be considered when changing ERP systems?
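For the duplicate-address question in item 2, a quick script run against an address extract can surface problems long before the conversion starts. Here is a minimal sketch in Perl; the file name and column layout (vendor, location, address line, city, state, zip) are assumptions, so adapt them to whatever extract your current system produces.

```perl
#!/usr/bin/perl
# Sketch: flag vendor addresses that appear more than once in an extract,
# so they can be consolidated before conversion.
# The CSV layout below is hypothetical; adjust the column list to your extract.
use strict;
use warnings;

my $file = 'vendor_addresses.csv';
open my $fh, '<', $file or die "Cannot open $file: $!";
my $header = <$fh>;    # skip the header row

my %seen;              # normalized address => list of "vendor/location" keys using it

while (my $line = <$fh>) {
    chomp $line;
    # Naive split: assumes no embedded commas or quotes in the extract.
    my ($vendor, $location, $addr1, $city, $state, $zip) = split /,/, $line;
    my $norm = uc join '|', $addr1, $city, $state, $zip;
    $norm =~ s/[.,#]//g;     # drop punctuation so "123 Main St." matches "123 Main St"
    $norm =~ s/\s+/ /g;      # collapse repeated whitespace
    push @{ $seen{$norm} }, "$vendor/$location";
}
close $fh;

for my $addr (sort keys %seen) {
    my @users = @{ $seen{$addr} };
    next if @users < 2;      # only report addresses used more than once
    print "DUPLICATE: $addr\n";
    print "    $_\n" for @users;
}
```

The same normalization rules (uppercasing, stripping punctuation, collapsing whitespace) can then be reused to decide which version of each address survives the consolidation.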


BL vs. BR: What are the major differences?

As a summary, BR allows for billing options beyond the simple item quantity-times-price invoice.

BL allows you to create recurring invoices automatically by running a job that generates them.

BR has jobs that will book the revenue recognition as defined for the contract.


| Feature / Module | BL (Billing) | BR (Billing & Revenue Recognition) |
| --- | --- | --- |
| Uses IC for item tracking | Y | N |
| Can bill ad-hoc items | Y | N |
| Can set up products for billing | Y (in IC) | Y |
| Can print an invoice to send to the customer | Y | Pro forma only |
| Interfaces to AR | Y | Y |
| Requires the Activity module | N | Y |
| Auto-creates revenue recognition entries | N | Y |
| Uses allocations to create revenue recognition entries | Y | N |
| Has an invoice entry form that looks like an invoice | Y | N |
| Creates recurring invoices | Y, for various intervals | N |
| Cost-plus billing | Y, based on pricing | Y |
| Pass-through billing | Y | Y |
| Time and materials billing | Y | Y |
| Units of production billing | Y | Y |
| User-defined billing | Needs to have an item with quantity times rate associated with the invoiced lines | Y |
| Milestone billing | Y; if there isn't a system-generated trigger for the milestone, any item quantity-times-price invoice can be created | Y; needs to be triggered by something |


Sometimes complex IPA flows can become cumbersome to maintain, or they can suffer degraded performance on large record sets. One way to combat these issues is to create custom scripts within your flows to do the "heavy lifting" of the process. Our favorite scripting language for this is Perl, because it is simple, doesn't have to be compiled, and is already installed on your Landmark and LSF servers. We use a File Access node to create the script at runtime. This works best because it is more secure, the script is versioned along with the flow (remember, Perl doesn't have to be compiled), and you can put flow variables right into your script. To do this type of scripting, you create your script in a File Access node and then run it in a System Command node. It is important to note that this will not work in a cloud multi-tenant environment, because System Command nodes are not allowed there.
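As a rough illustration, here is the kind of script this approach produces. It is only a sketch under assumed inputs: the file names and CSV layout are invented, and in a real flow the File Access node would write this text out to the server (with flow variables substituted into it) before a System Command node runs it with perl.

```perl
#!/usr/bin/perl
# Sketch of a "heavy lifting" script that an IPA File Access node might write
# at runtime and a System Command node might then execute.
# The file paths and CSV layout are hypothetical; in a real flow they would
# typically come from flow variables substituted in when the script is written.
use strict;
use warnings;

my $in_file  = '/tmp/vendor_extract.csv';    # hypothetical file produced earlier in the flow
my $out_file = '/tmp/vendor_summary.csv';    # hypothetical file read back by the flow

open my $in,  '<', $in_file  or die "Cannot open $in_file: $!";
open my $out, '>', $out_file or die "Cannot open $out_file: $!";

my %total_by_vendor;

while (my $line = <$in>) {
    chomp $line;
    next if $. == 1;                         # skip the header row
    my ($vendor, $invoice, $amount) = split /,/, $line;
    $total_by_vendor{$vendor} += $amount;    # aggregate here instead of looping in the flow
}

print {$out} "VENDOR,TOTAL\n";
for my $vendor (sort keys %total_by_vendor) {
    printf {$out} "%s,%.2f\n", $vendor, $total_by_vendor{$vendor};
}

close $in;
close $out;
```

The flow can then read the summary file back in with another File Access node, so the looping and aggregation happen in one quick script rather than across thousands of node iterations.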

To maintain Landmark security (classes and roles), go to Start > Configure > Security in the Gen environment in Rich Client.  If you don't see Security or Configure, you will need to have your security administrator give you access to it.  The Infor-delivered role for this is "ConfigConsoleSecurityadmin_ST".  Have that role added to your account and wait about 30 minutes for the sync to complete.

One way to control the “clutter” on your more complex IPA processes is to utilize the concept of “Dynamic Commands”.  Many times your flows will follow a pattern of reading some sort of data, validating/manipulating the data, then taking action on that data.  You can move much of this work into Stored Procedures and set up a configuration table to tell your flow which command to execute next.  This method also allows for more granular error handling and logging.

Here is how a sample configuration table and flow work together: the flow reads the table in and follows each step in order, using a SQL Query node to get the command text for each step.  The Action field shows what type of action will be completed once the stored procedure brings back the command.
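To make the pattern concrete, here is the dispatch loop sketched as a standalone Perl/DBI script. In the actual flow this loop is built from a SQL Query node plus branching, and the commands are stored procedures you have written; the DSN, credentials, and the FLOW_STEP table with its STEP_ORDER, ACTION, and COMMAND_TEXT columns are all invented for the example.

```perl
#!/usr/bin/perl
# Sketch of the "dynamic command" dispatch pattern as a standalone script.
# Requires DBI plus an ODBC driver (DBD::ODBC); the DSN, credentials, and
# table/column names are hypothetical.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:ODBC:DSN=LAWSONDB', 'ipa_user', 'secret',
                       { RaiseError => 1, AutoCommit => 1 });

# Read the configuration rows in step order; each row names the action and
# the command (typically a stored procedure call) to run next.
my $steps = $dbh->selectall_arrayref(
    'SELECT STEP_ORDER, ACTION, COMMAND_TEXT FROM FLOW_STEP ORDER BY STEP_ORDER',
    { Slice => {} },
);

for my $step (@$steps) {
    print "Step $step->{STEP_ORDER}: $step->{ACTION}\n";
    my $ok = eval { $dbh->do($step->{COMMAND_TEXT}); 1 };
    unless ($ok) {
        # Granular error handling: log exactly which configured step failed, then stop.
        warn "Step $step->{STEP_ORDER} failed: $@";
        last;
    }
}

$dbh->disconnect;
```

Adding or reordering a step then becomes a configuration change (a new row in the table) rather than a change to the flow itself, which is where most of the reduction in clutter comes from.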

When changing level addresses on GL20 or AC10, make sure to specify all of the level addresses, not just the ones that might have changed.

When you create the file for uploading, make sure to list the first level first, the second level next, and so on until all levels are rebuilt.  If you try to change an address to one whose upper levels did not exist in the old system, you will get errors.

Just like building the level addresses, you start with level 1, add a level 2 to an existing level 1 and so on.  Same thing when you rebuild the addresses – make sure to rebuild level by level.
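If the upload file is large, a short script can enforce that level-by-level ordering before you load it. This is only a sketch under an assumed layout: a CSV whose first column is the accounting unit and whose next four columns are the level address fields, so adjust the column positions to match your actual upload template.

```perl
#!/usr/bin/perl
# Sketch: sort a level-address upload file so that level 1 rows come first,
# then level 2 rows, and so on, mirroring the order in which the hierarchy
# must be rebuilt on GL20/AC10.
# Assumed layout: column 1 = accounting unit, columns 2-5 = level addresses.
use strict;
use warnings;

my ($in_file, $out_file) = ('gl20_upload.csv', 'gl20_upload_sorted.csv');

open my $in, '<', $in_file or die "Cannot open $in_file: $!";
my $header = <$in>;
my @rows   = <$in>;
close $in;

# Depth = number of populated level columns; a row that fills levels 1 and 2
# is depth 2 and must load after every depth-1 row.
sub depth {
    my ($line) = @_;
    chomp $line;
    my @cols = split /,/, $line, -1;    # -1 keeps trailing empty fields
    return scalar grep { defined $_ && $_ ne '' } @cols[1 .. 4];
}

open my $out, '>', $out_file or die "Cannot open $out_file: $!";
print {$out} $header;
print {$out} $_ for sort { depth($a) <=> depth($b) } @rows;
close $out;
```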

After AD FS is implemented for your Landmark and LSF environments, Landmark will need to connect to the LSF server using the thick client URL and the user principal name (UPN) of your admin account.  This means that all of your LSF connections will need to be updated in Rich Client, including the Infor Lawson Connection, File Activity Connection, System Command Connection, and Web Run Connection.  You will need to update the Web Root in each of these connections to the thick client URL, which is most likely your LSF server URL with port 1447 (e.g. https://lsf.company.com:1447); check with your installer to verify the port.  The User will need to be updated to the UPN value, for instance lawson@company.com.

It is recommended to keep the chart of accounts consistent between GL and Activities.  The Activity field, when populated, already differentiates a journal entry line from one posted to the same account without an activity specified.  This makes it easy to determine how much was booked to a specific account and to split the bookings to that account into project-related and non-project amounts.

Account Categories in the AC module also allow you to have a separate chart of accounts, if desired, from your GL Chart of Accounts.

Since an account category is required when an activity is entered, it allows for a separation of the GL and AC charts if separation is desired.

Many users set the Account Category to default in, so this value doesn't have to be specified on each transaction that uses an activity throughout the system.  The defaulting could occur on GL20 instead of GL00 if the accounting unit used in the transaction would alter the way you account for the project/activity posting.

To create an income statement for a subset of accounting units, create an Accounting Unit list on MX10 that includes the companies and accounting units you want in the report. Specify that list when running the GL293 and presto: an Income Statement with just those company-accounting unit combinations.

This can also be done by creating company groups if the “bursting” should be done by different companies.

A Level group can also be used, which is great if one of your Accounting Unit levels signifies a reporting level.  As with the MX10 list, using RW70 level groups can also create a "bursting" effect on the GL293 Income Statement report.

Of course, creating different RW100 reports can also achieve the same effect, and they can be run together as a whole by specifying a Report Group on RW100, for example.  Use the same Format for all of the reports, and create different Row definitions that allow you to "burst" the income statement into different accounting groups.