The Unique Family Dynamics of a Successful ERP Implementation

Tolstoy famously remarked that “all happy families are alike; each unhappy family is unhappy in its own way.”  Reflecting on Tolstoy’s own relations and on the kindred lives of the characters in his novels, I’ve often wondered if Enterprise Resource Planning (ERP) implementations are like families, and whether such categorical statements could be similarly applied to successful and unsuccessful families of projects.  While every project has its own unique dynamics, I’m obliged to believe that roughly the inverse of Tolstoy’s statement is the case—that each happy ERP implementation isn’t alike, but rather is successful in its own way.

 

That is, I’ve seen successful ERP implementation projects that have differed from one another in surprisingly significant ways.  As such, it might be best to review successful ERP projects individually and try to understand what it is among them that made them successful.  Anyone can wax eloquent on the generic platitudes that lead to a successful implementation, but in practice, when the time comes to make tradeoffs between platitudes, it’s helpful to know how companies work through challenges and finally arrive at successful implementations.

 

One project that we recently completed fit such a mold.  While not free of obstacles, the end-product was immensely successful.  A number of key factors led to the ERP implementation’s success:

  • All of the team members were engaged and on board.  Getting the team to buy into the project’s mission, and actively support that mission, was never a problem.
  • The project team did a large amount of their own end-to-end testing.  Unlike some projects, where the team only tests while the consultants are onsite, the team verified their system configuration and business processes whenever possible, leading to a rock-solid business process at cutover.
  • The team took ownership of issue resolution.  The team dug in, tried things out, and came to solutions.  This served to greatly shorten certain phases of the project.
  • The team made decisions quickly, collaboratively.  The project was rarely, if ever, waiting on a key decision, and nobody on the team could have been accused of analysis paralysis.
  • The team took responsibility for their roles and completed the work on time.  Schedule attainment was a high priority, and the team put in the necessary work to make things happen.
  • The team displayed a culture of respect, staying respectful during difficult conversations and decisions.  The stresses involved in an ERP project can at times encourage dysfunctional or toxic behaviors, but this team treated each other with a high degree of respect, even when working through the toughest decisions.
  • The team’s project management was of the highest caliber, displaying excellent collaboration and communication with the core team, and with the EstesGroup team as well.

The net result was a successful ERP implementation project on-time and on-budget, with the expected level of system capabilities.  The team experienced a clean and quiet cutover, and quickly stabilized.  Within a short time, the company had moved onto managing daily operations and planning for the future.

Every project has its wayward sheep, be it wavering executive sponsorship, excessive customization, inadequate team investment, or challenges with data conversion.  No project ever checks all the happy boxes.

 

But in spite of challenges, the best companies still manage to successfully implement their enterprise systems, keeping their team engaged, committed, and dependable—regardless of all the unique twists in their project’s DNA. 

 

Are you ready for your company to create its own exceptional implementation story? 

Come talk to us, and we’ll share some of the greatest success stories of ERP history—prosperous implementations similar in success, yet nuanced in achievement—stories that can inspire your own project to be a story with a happy ending.

The Company You Keep: Deploying Company-Specific Customizations in a Multi-Company Environment

Setup is crucial for a successful company-specific customization in an Epicor ERP multi-company environment.

Maintenance of the Epicor ERP menu in multi-company environments can be unintuitive, and many customers come to us looking to better understand its capabilities and its limitations.  One area of special frustration is the deployment of customizations in multi-company environments.  Deploying company-specific customizations—especially since ERP version 10.1.600—has been a point of confusion.  Fortunately, once the steps are understood, the act of getting your customizations to the menu becomes less ambiguous, even if a little cumbersome.

 

When creating customizations, you can develop one that is specific to a single company or one that applies to all companies.  Let’s assume you were making a customization of the Epicor ERP Part Maintenance form.  For instance, perhaps you thought the Part Maintenance form would look better with a big green spot:

Let’s also assume you wanted the big-green-spot version deployed only to the company you were working in (in the case below, this would be the “EPIC06” company).  As such, you’ve saved the customization by not setting the “All Companies” flag and allowing the company to remain the one you’ve been working in.  This creates what is called a “company-specific customization”:

Now, deploying this customization to the standard “System” menu is not possible—the customization is not available when you click the “Customization” drop-down in Menu Maintenance:

To deploy the customization to the current company without affecting all companies, navigate to the “Actions” menu in Menu Maintenance and select “Copy to Current Company”:

When this is done, the application makes a copy of the system menu.  In doing so, the new menu carries over a number of values from the original menu: the Menu ID, the Name, the Security ID, the Parent Menu ID, and even the Order Sequence.  But a number of key fields change.  The duplicate menu is no longer a System Menu, as it now has a Module type of “UD.”  The menu no longer applies to All Companies, as the owning company is now the company in which it was deployed.
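The net effect of the copy can be sketched in a few lines of Python. The field names here are illustrative approximations of the menu record, not Epicor’s actual schema:

```python
def copy_to_current_company(system_menu, current_company):
    """Approximate the 'Copy to Current Company' action in Menu Maintenance.

    The identifying fields (Menu ID, Name, Security ID, Parent Menu ID,
    Order Sequence) carry over, while the copy is re-flagged as a
    user-defined, company-specific menu record.
    """
    menu_copy = dict(system_menu)           # carry over the original values
    menu_copy["Module"] = "UD"              # no longer a System menu
    menu_copy["AllCompanies"] = False       # scoped to a single company
    menu_copy["Company"] = current_company  # owned by the deploying company
    return menu_copy

# Illustrative usage with a hypothetical menu record
system_menu = {"MenuID": "PART-MAINT", "Name": "Part Maintenance",
               "Module": "System", "AllCompanies": True, "Company": ""}
ud_menu = copy_to_current_company(system_menu, "EPIC06")
```

The original system menu record is left untouched; only the duplicate carries the company-specific flags.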

Most importantly, the “Customization” drop-down now allows you to select the customization that you’ve created:

When deploying a company-specific customization in a multi-company environment, the above steps allow you to create a menu deployment that will replace the system-based menu deployment for the company in question.  To demonstrate this, log out of the application and back in. 

 

As you can see, only a single menu node for the Part form is displayed:

And when this single node is selected, the customized version of the Part form that was previously created (big green spot and all) is displayed:

If, for some reason, you need to revert to the base “System” menu, you can always delete the duplicated menu from Menu Maintenance.

 

The ability to manage company-specific customizations in a multi-company environment is of great importance, especially in environments where companies have vastly differing business requirements and require highly specific form deployments.  Such capabilities will keep your requirements aligned with your environments and keep your user base in good company.

Have any feedback or questions about customizations? Let us know.

Historical Transactional Data Conversion: Prevent the Past from Haunting Your Present

“I give [this watch] to you not that you may remember time, but that you might forget it now and then for a moment and not spend all your breath trying to conquer it. Because no battle is ever won he said. They are not even fought. The field only reveals to man his own folly and despair, and victory is an illusion of philosophers and fools.” – William Faulkner, The Sound and the Fury

 

Living and writing at a time when the legends of the Civil War and the fables of Reconstruction were still part of living memory, William Faulkner wrestled in his work with the ideas of history, memory, mythology, and heritage, and with the challenges of one’s immediate existence amid such a monumental backdrop.

 

The notion of contending with one’s history extends far beyond the literary world, and takes a place of eminence within the world of enterprise system implementation.  Almost without exception, I find myself at the onset of Epicor implementation projects working through options with customers regarding how best to manage the historical transactional data from their legacy system.  Customers find themselves in a paradoxical situation: they want to move forward with a new system that can meet their upcoming strategic goals and initiatives, but they also want to be able to reference the rich history that was built up in their legacy system.  It is as if they wish to rip all the lath, plasterwork, and wainscoting off their old home and slap it onto the walls of their new dwelling.  But as any carpenter would attest, the fit of old materials is never perfect.  Often, it is downright shoddy.

Given the challenges of layering the old with the new, I often work with customers to help them understand the perils and promises of their data desires.  I try to find a way to satisfy the company’s history without compromising the new installation in question.  The fundamental challenge has to do with whether a specific company can bring over historical transactional data from their legacy system.  Transactional data may include Quotes, Sales Orders, Purchase Orders, Work Orders, Shipments, Receipts, Invoices, etc.  In going over the options, there are normally three ways in which a company can reference its historical data as part of a new Epicor implementation:

 

  • The transactional data can be converted using Epicor’s Data Management Tool (DMT).  That is, the legacy data can be manipulated to fit into a format conducive to the setup of the Epicor environment and loaded into the database, as if it were a live transaction load.
  • The transactional data can be loaded into Epicor user-defined (UD) tables and referenced from within the application using BAQs and Dashboards.
  • The legacy database can be connected to the Epicor application using external data source configurations, and then to queries using external BAQs.
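As a rough sketch of the first option, the transformation step often amounts to reshaping a legacy extract into the column layout a DMT template expects. The column names and the legacy format below are illustrative assumptions, not actual DMT template headers:

```python
import csv
import io

# Hypothetical legacy order extract
legacy_orders = [
    {"ord_no": "A-1001", "cust": "ACME", "part": "WID-100", "qty": "25"},
    {"ord_no": "A-1002", "cust": "ACME", "part": "WID-200", "qty": "10"},
]

# Map legacy fields onto an (illustrative) DMT-style column layout
dmt_columns = ["Company", "OrderNum", "CustomerID", "PartNum", "OrderQty"]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=dmt_columns)
writer.writeheader()
for row in legacy_orders:
    writer.writerow({
        "Company": "EPIC06",                      # target company
        "OrderNum": row["ord_no"].split("-")[1],  # strip the legacy prefix
        "CustomerID": row["cust"],
        "PartNum": row["part"],
        "OrderQty": row["qty"],
    })

dmt_csv = buffer.getvalue()  # ready to be loaded as if a live transaction
```

Even in this toy case, note that the legacy order numbers are no longer preserved verbatim, which foreshadows the fidelity problems discussed below.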

 

The primary concern has to do with the desire to actually convert legacy transactional data into Epicor’s database without compromising the historicity of the data itself, and without gumming up the system with a lot of noise.  It’s hard enough to correctly convert live records, much less mountains of ancient data.  From my own experience and from the experiences of my coworkers, we do not consider it a best practice to convert historical transactional data into Epicor’s standard table structure when implementing Epicor ERP.  The following reasons underlie this recommendation:

  • When implementing or reimplementing, it is often the case that the base setup data values change, in terms of their naming conventions, quantity, values, etc.  This can make the transformation of legacy data labor-intensive and error-prone. 
  • ERP data is by its nature integrated.  Loading transactional records can trigger unexpected effects.  For instance, should an Epicor customer elect to import the Purchase Orders from the legacy environment, the system would plan for these POs to be received and alter system planning accordingly.  To avoid this, these records would need to be closed, but the data in question would therefore fail to reflect the original legacy data, in which Purchase Orders were closed through the receiving/invoicing process at a much earlier date. 
  • To load transactional data in the form in which it exists in the legacy environment, the entire collection of related records needs to be loaded.  For instance, if a customer wished to have purchase order history, to have representative data in the Epicor environment, the related PO Receipt and AP Invoice records would also need to be included.  This is, among other things, a tremendous amount of work. 
  • In some cases, Epicor’s business logic updates fields and does not allow these to be user-updated.  For instance, Sales Order or Purchase Order lines are system-set and not user-modifiable.  This makes it difficult to load historical data that accurately reflects the legacy data: what was PO Line 2 in the legacy system may get converted to PO Line 1, as it was the first line to be loaded.
  • Transactions have general ledger implications.  Loading transactions can thus have unexpected consequences that affect WIP, inventory, and their related GL accounts.  For instance, should the user import Purchase Order Receipt records, the transaction date would not by default reflect the actual date in which the transaction occurred.  Should the customer wish to date the transactions into the past, the necessary fiscal periods would need to be created.  If the load does not include all transactions, the General Ledger would be inaccurate, and would require effort to reconcile and correct. 
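The line-numbering pitfall above can be shown in a few lines of Python. This is a simplified model of sequential line assignment, not Epicor’s actual business logic: if the system numbers lines in the order they are loaded, any legacy line that was excluded (say, a line closed years ago) shifts the numbering of everything after it:

```python
# Legacy PO lines; legacy line 1 was closed long ago and excluded from the load
legacy_lines = [
    {"legacy_line": 2, "part": "WID-200"},
    {"legacy_line": 3, "part": "WID-300"},
]

# Sequential assignment on load, as a system that sets its own line numbers would do
converted = [
    {"new_line": i, "part": row["part"], "legacy_line": row["legacy_line"]}
    for i, row in enumerate(legacy_lines, start=1)
]
# What was legacy line 2 is now line 1: the converted history no longer
# matches the paper trail from the legacy system.
```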

 

An alternative to performing a DMT table load into the standard tables is the utilization of Epicor UD tables to store this data.  This avoids many of the perils above, but requires the implementing customer to construct querying tools (BAQs and Dashboards) to retrieve and present this information.  Moreover, I’ve found that the actual usage of historical data tends to be less than anticipated, and tends to decrease significantly shortly after cutover.  I recently had one customer go to great lengths to convert data to UD tables and construct the necessary querying tools to view the data—only to forget almost entirely about these capabilities amidst the blooming, buzzing confusion of cutover.  By the time they had settled down, they had already built enough living history into their database to move forward, and the UD tables were all but forgotten.
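A minimal sketch of the UD-table approach: legacy fields get mapped onto the generic key and value columns of a user-defined table, and a BAQ or dashboard later reads them back. The column names below follow a generic Key/Character/Number pattern but are assumptions about the target table, not a documented schema:

```python
def to_ud_row(company, legacy_invoice):
    """Map a legacy invoice onto generic UD-table columns (illustrative)."""
    return {
        "Company": company,
        "Key1": legacy_invoice["invoice_no"],    # primary legacy identifier
        "Key2": legacy_invoice["customer"],
        "Character01": legacy_invoice["status"],
        "Number01": float(legacy_invoice["amount"]),
    }

# Illustrative usage with a hypothetical legacy record
ud_row = to_ud_row("EPIC06", {"invoice_no": "INV-9001", "customer": "ACME",
                              "status": "Paid", "amount": "1250.00"})
```

Because the rows land in generic columns, the mapping itself becomes documentation the querying tools must encode, which is part of the construction effort described above.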

 

A final option is to access the legacy database using external data source connections and query it using external BAQs.  This option is normally the simplest of the three, as it saves the time of initially converting the data, though it similarly requires the construction of the necessary querying tools to retrieve and present the data.  Depending on the age and format of the legacy database, the external data source option might not be feasible.  As part of some implementations, I have seen customers convert their legacy databases into SQL format at cutover, for the express purpose of querying their ad hoc history via external BAQs.  In other cases, I’ve seen customers fall back to the UD table option when other options were seen as too labor-intensive.
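The shape of this third option can be sketched with a throwaway in-memory database standing in for a legacy system that was converted to SQL at cutover; an external BAQ would issue an equivalent read-only query against the real legacy database. The table and column names here are hypothetical:

```python
import sqlite3

# Stand-in for a legacy database converted to SQL format at cutover
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_po (po_num TEXT, vendor TEXT, total REAL)")
conn.executemany("INSERT INTO legacy_po VALUES (?, ?, ?)",
                 [("P-100", "ACME", 500.0), ("P-101", "GLOBEX", 750.0)])

# The kind of read-only historical lookup an external BAQ would perform
rows = conn.execute(
    "SELECT po_num, vendor, total FROM legacy_po WHERE vendor = ?",
    ("ACME",)).fetchall()
conn.close()
```

The key design point is that the legacy data is never written into the live ERP tables: it stays where it is, unchanged, and is only read on demand.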

 

As such, given the other options, I recommend limiting the conversion of data at cutover to master files and open transactions, and then utilizing external data source connections to the legacy database, and/or UD table conversion, to gain access to historical legacy data.

 

When it comes to the art of crafting prose, William Faulkner forgot more than most of us remember.  Elsewhere in his works, Faulkner wrote that “[t]he past is not dead. It’s not even past.”  In a similar vein, the attempt to bring old historical data into a new ERP system is itself an act of rewriting history.  And in doing so, one sacrifices accuracy for availability.  Also, when implementing a new system, a company quickly learns that the high cost of retaining history is something they would soon like to forget.  As a company builds a new history in its new system, the old history becomes less and less valuable, with such surprising rapidity that the old soon takes its place at the back of the database, taking up space and gathering data dust, like a childhood diary left on a top shelf and forgotten. 

 

Do you have questions about the Data Management Tool or best practices for data conversion? Or do you want to talk with the author about anything Epicor related? Let us know.

Cloud Computing Is Key To Recession Proofing Your Business

Recession-Proof Investments: Cloud Computing and the Refrigerator of the Future

We’ve long been told that necessity is the mother of invention, but when necessity is born of a sense of privation, reactions vary. Born and raised in the world of residential and commercial construction, I’ve felt the motion sickness that results from the ups and downs of the building cycle since my earliest memories. And I’ve been in business long enough to have stomached my share of down-cycles and observed how different companies react to these changing economic climates.

 

As the current bull market gets slowly walked to the slaughterhouse, managers at all levels begin to wonder what it will mean for their own place within America’s larger business landscape. There is always the search for the investments that could be considered “recession-proof”—investments that will yield value during the current crisis, but would also serve as a foundation for future success. Simple cost-cutting is rarely such an investment—it yields short-term savings, but often at the expense of long-term objectives: I’ve never seen a hiker make it to the top of a peak faster by trimming down the soles of his boots.

 

That is, the most reactionary companies resort to knee-jerk cost-cutting in order to get in line with shrinking revenues: eliminating optional programs or reducing essential services to their bare minimum. More innovative companies use this newfound necessity as a means of transforming their current state and getting ahead of the competition.

 

At the consumer level, the most well-known example of this phenomenon was the advent of the refrigerator. It was the Great Depression, of all things, that led to the broad use of the refrigerator by America’s large middle class. While it was, at the time, a significant capital expenditure for any given household, the refrigerator allowed families to save time and money: the ability to extend the life of the day’s victuals allowed families to reduce waste, and thus cut costs, while freeing for other activities the labor that would otherwise have gone into the next meal. This rendered the old-fashioned ice box obsolete. In this way, times of downturn often have a way of surfacing new innovations—superior products suddenly emerge because the changed competitive landscape has reduced the viability of their less-innovative rivals.

 

Similarly, as America hit rock bottom in the late 2000s and early 2010s, cloud computing grew rapidly, as companies looked for ways to reduce cost and risk while scaling up for the future. In this way, cloud computing may very well become the coming recession’s refrigerator—the tool that will allow individuals and companies to strategically equip themselves not only for the hard times ahead, but also for the good times thereafter.

 

For Epicor customers, cloud computing surfaces as an opportunity to avoid the costs of replacing outdated hardware. Also, by moving installations into hosted environments, customers are able to eliminate the cooling costs required to keep their stacks on ice. Estes Group’s Epicor Cloud Managed Hosting offering (ECHO) is ready-by-design to protect and carry your company through the down-cycles and get you back into the saddle and riding the next bull market to better times. Looking to recession-proof your business? Please reach out to our team, and we’ll help you innovate a cloud computing strategy that will keep you ahead of the storm.

 

Are you looking for cloud computing options, or have questions on how we can help make your systems more flexible? Contact Us today or let us know below.

Data Center Location is Critical to Your Company’s Success and Survival

Looking California When You’re Feeling Minnesota: Where is the Best Data Center Location?

 

For manufacturing companies, the advent of “cloud computing” has raised a lot of questions.  Luckily, you don’t have to wander lonely as a cloud to find answers to your questions surrounding cloud solutions for your business.  Not as complicated as a cumulonimbus or as feathery as a cirrus, a cloud in the field of technology is as simple, or as complicated, as someone else’s computer.  But of the many questions a manufacturer may have, one frequently surfaces in relation to the location of the data: “So where is my data located, anyway?”

 

This isn’t a small squall of a question: whether you are looking at an on-premise installation or a server stack in the cloud, your primary and secondary data centers’ location is a decision of atmospheric proportions—one with direct business impact.

 

Whether choosing hosted or cloud solutions, your data center location is critical.  You must be wary of where exactly your data center servers are located, for all clouds are not created equal.  Downtime is the great fear when it comes to all things computing, and it is often the result of natural disasters—do you remember how long it took to get the power grid functioning in Puerto Rico after Hurricane Maria?  Clearly, minimizing the risks of mother nature is a central concern.  Let’s take a down-to-earth look at some of the natural dangers facing your company’s data.

 

Earthquakes

 

When I worked in Arkansas a number of years ago, in an area that was on the edge of the New Madrid seismic zone, I noticed the strange cross-bracing in one of the factories, and I asked a local about it.  He explained the seismic risks in the area, and recounted the family lore about the quake of 1812.  Then he looked me square in the eye and said, “Whatever you do, don’t blame Arkansas—it wasn’t our fault.” 

 

It can be a surprise to discover that one of the largest earthquakes in North America’s recorded history occurred not along the California coast but along the New Madrid seismic zone in Missouri—of all places!  This was the quake that briefly caused the Mississippi River to run upstream back in 1812, a year almost exclusively famous for the conflict between America and England.  But while the Americans were locked in battle with the British on the East Coast, they were unwittingly losing the war with nature in the Midwest.

 

This might serve as a warning if you locate your data center in a seismic zone—if your server gets death-rattled into oblivion, it’ll be your own fault.

 

 

Tornadoes

 

Nothing can lay your blades out like a deck of 52 quite like a tornado.  Tornadoes pry open buildings like nature’s proverbial can opener, allowing copious rain and debris to decorate your server room like a third grade art project, and you don’t want to see your data garnished with nature’s glitter.  Tornadoes pose a risk not only to your data center itself, but they also tend to knock out your primary—and even your secondary power supplies.  Backup generators are often located adjacent to a building, making them a potential target for mother nature’s twisted wrath.  So while a twister might leave a building unscathed, it might take out your external generator, rendering backup power systems useless.  Of course, that’s a moot point if the contents of your data center are laid out across the lawn like your laundry, for all to see.  Luckily, a proper data center location can help you avoid an unfortunate game of 52-pickup.

 

 

Floods

 

I reached out to one of my customers after a series of tornadoes ripped through Oklahoma, and he gave the all-clear: “The twisters missed us, but the water levels are so high, some folks can’t get into work.”  That is to say, a natural disaster can be more sneaky than a weather channel headline.  While things like tornadoes get a lot of attention, water levels can do a lot more damage over time.  As such, one might think twice about locating a data center on a floodplain.  While all my gamer buddies are hyped over water-cooled CPUs, I don’t quite think this is what they’re referring to. 

 

 

Hurricanes

 

Hurricanes amount to the worst of wind and water, with the ability to pummel your data center into paste from above, or dissolve it into a silicon solution from below.  And while the zone immediately affected by hurricanes is rather small, the extended zone where hurricane-related storms transform into inland berserkers is much larger.  Locating your stacks in a place that is far-removed from the hurricane fallout zone will serve you well in reducing wind and water risks. 

 

 

Heat

 

Another sneaky disaster when it comes to all things electronic is heat.  Not too long ago, I was in Charlotte, NC with a coworker.  One morning after breakfast, we were about to head to the customer site when my coworker ran back into the hotel to retrieve his coffee mug, leaving me in the parking lot.  I stood out in the morning heat for maybe a minute or two.  Now, being a Canadian, I generally overheat reading the newspaper, and the morning temp in Charlotte was obliterating.  By the time we got to carpooling, I was already a puddle.  And this was still in the early morning!  Servers are like Canadian consultants—they work better in temperate climates.  When choosing a shack to hang your racks, look to locate it in a place where your cooling systems won’t be fighting a losing battle with the heat index.  Servers generate enough heat on their own—they don’t need any help! 

 

The Cloud

 

While the notion of “The Cloud” brings with it visions of the ethereal, it is in reality quite terrestrial in nature.  Hosting a customer’s ERP system is a huge responsibility, and not one to be taken lightly.  The cloud itself can be just as risky as a hurricane.  As such, the EstesGroup is all about maximizing service while minimizing risk.  In support of our Epicor Hosting initiative, we keep our data center located in Michigan, which has a favorable climate for keeping servers cool as a cucumber, while avoiding the many environmental pitfalls noted above.  Moreover, by having our data center location in the Midwest, we provide centrality that allows us to rapidly service a broad region.  With optimal location and cloud infrastructure, the team at EstesGroup can serve your business needs by providing ideal solutions for your data, regardless of the weather. 

 

If you find yourself looking to the sky for answers to your worldly business questions, please give our team a call.