We’ve heard them before and they’re easy to find: quotes about the positive aspects of mistakes. I like this witty one from Eleanor Roosevelt:
“Learn from the mistakes of others. You can’t live long enough to make them all yourself.”
Surely, we learn more from our mistakes than from our successes.
We have seen many different blogs from colleagues in the PLM community about the situation with Swedish telecom giant Ericsson and its PLM implementation (Oleg Shilovitsky: What to Learn from Ericsson PLM Failure; Jos Voskuil: The PLM Blame Game; and Verdi Ogewell in Engineering.com: Telecom Giant Ericsson Halts Its PLM Project with Dassault’s 3DEXPERIENCE).
Here is our take on the situation.
About Ericsson and the 3DX Implementation
The company is the fifth largest software company in the world and is a leader in 5G technology. It “has a market presence in 180 countries, with 40 percent of the world’s mobile traffic running on systems supplied by Ericsson.” The organization has approximately 95,000 employees, and its 2018 revenues were around $24.2 billion.
In 2016, Ericsson selected Dassault Systèmes 3DEXPERIENCE Platform and decided to implement it via a so-called “big bang” implementation affecting approximately 25,000 users in R&D and other departments. The implementation was aggressively projected to take 14 months. The company planned to stop using its former IBM mainframe environment, PRIM, which used Dassault’s ENOVIA V5/Matrix 10 as its PDM system and to move over to Dassault’s 3DEXPERIENCE/V6 architecture with ENOVIA V6 as a configuration and database engine to replace PRIM.
What Went Wrong?
Before even looking at specific issues, we know the PLM community will readily understand that it is highly challenging to replace an old system and introduce a new PLM solution in such a large organization, let alone in such a short period of time. Both the size of the project and the extremely short time frame hampered the implementation from Day One.
A Big Bang Implementation
What is a “big bang” implementation methodology and what is its alternative?
A big bang implementation means that the company purchases and deploys all available PLM features with full functionality. It is a turn-key approach, which means all existing PLM-related processes and data (often from different silos of data) are transitioned to a brand-new system, all at the same time, across the entire enterprise.
In this situation, the system is not turned on until there is a fully functional system with all legacy data in it. It has its pros and cons.
- The benefit of this approach is that when the company goes live, it is not dependent on the legacy systems or older processes – users can do all their work in the new system.
- The drawback of this approach comes from the fact that the company may be transforming multiple systems and migrating millions of records, requiring the proper amount of time and effort, with huge quantities of interrelated data that needs to be correctly configured.
Alternatively, when using an incremental or phased approach for the implementation methodology, the company deploys the capabilities and features in stages over time. The implementation sequence can be based on a specific project, product scope, incremental number of users, company divisions, etc.
- The benefit of this approach comes from achieving quick wins and the ability to start using parts of the new PLM system without waiting for the full system to be up and running.
- The drawback of this approach results from the fact that in some situations, all systems may still need to be used and processes may need to be put in place to understand how the “new” and “old” systems interact. The best case is to take a process or area that is currently not being managed by a system and implement those processes in the new PLM system as a phase 1 step.
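To make the phased alternative concrete, here is a minimal, purely illustrative sketch of a rollout plan. The phase names, user counts, and data structure are hypothetical assumptions for illustration; they are not taken from the Ericsson project. The point is that each phase delivers a measurable increment of users on the new system while the legacy system stays live until the end:

```python
# Hypothetical sketch of a phased PLM rollout plan.
# Phase names and user counts are illustrative only, not from Ericsson.
from dataclasses import dataclass

@dataclass
class Phase:
    name: str                 # scope of this increment
    users: int                # users onboarded in this phase
    legacy_still_live: bool   # does the old system remain in use?

plan = [
    Phase("Pilot: one product line, document management", 200, True),
    Phase("R&D change management", 1_800, True),
    Phase("Full BOM and configuration management", 10_000, True),
    Phase("Enterprise-wide rollout, legacy retired", 13_000, False),
]

def cumulative_users(plan):
    """Running total of users live on the new system after each phase."""
    total, totals = 0, []
    for phase in plan:
        total += phase.users
        totals.append(total)
    return totals

print(cumulative_users(plan))
```

Each entry is a potential “quick win”; only the final phase switches off the legacy system, which is exactly the interaction between “new” and “old” systems that the drawback above warns must be planned for.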
Migrating Legacy Data
Data migration from one system to another is a complex and demanding task that requires multiple rounds of testing, fixing, testing, re-running, testing, and testing again. It will never be “one and done.” The quality of existing data is always a stumbling point and requires extensive and detailed attention. The complexity amplifies tenfold when dealing with different silos of legacy data and systems.
As Verdi Ogewell writes: “According to my sources, this data may have become corrupt and distorted in part.” Sources within Ericsson reported, “Every time we have tested the migrated data, all use cases failed.” He also suggests that corrupted data may have disrupted the functionality of Ericsson’s Corporate Basic Standard (CBST), which is its description of products, but this has not been verified.
Cleaning and preparing data for migration is a time-consuming and highly detailed process that cannot be rushed. It is critical to allow sufficient time to work with the legacy data and test the migration until it works.
Issues can come up in a variety of areas when migrating data:
- The data itself is frequently corrupted with broken links, missing meta data, etc. When migrating from one system to the other, it is essential to include a project to clean the data.
- The migration tools or migration technology might have bugs or might not be configured correctly, which will cause failures while testing and validating the data.
- The system may not be configured properly to support the various user scenarios. Even with the best migration tools/processes, if the system is not configured to support the required user scenarios, just having the data in the system does not do any good.
The bottom line here is that all parties – the experts in legacy data, migration and systems/architecture – need to work together to perform a successful migration.
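As a concrete illustration of the first two failure modes above (corrupted links and missing metadata), a validation pass over migrated records might look like the following minimal sketch. The field names and rules are hypothetical; real PLM migrations rely on vendor tooling and far richer rule sets, but the test-fix-retest loop is the same:

```python
# Minimal, hypothetical data-quality check for migrated PLM records.
# Field names ("part_number", "where_used", etc.) are illustrative only.

def validate_record(record, known_ids):
    """Return a list of problems found in one migrated record."""
    problems = []
    # Missing metadata: required attributes must be present and non-empty.
    for field in ("part_number", "revision", "owner"):
        if not record.get(field):
            problems.append(f"missing metadata: {field}")
    # Broken links: every referenced assembly must exist in the migrated set.
    for parent in record.get("where_used", []):
        if parent not in known_ids:
            problems.append(f"broken link to {parent}")
    return problems

records = {
    "P-100": {"part_number": "P-100", "revision": "B", "owner": "jdoe",
              "where_used": ["A-001"]},
    "A-001": {"part_number": "A-001", "revision": "A", "owner": "",
              "where_used": []},
}

known_ids = set(records)
report = {rid: validate_record(rec, known_ids) for rid, rec in records.items()}
print(report)
```

A report like this is re-run after every cleanup and every migration re-run; the migration is only “done” when every record comes back with an empty problem list for every use case.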
System Integration from the Old to the New
Ericsson employed as many as 400 to 500 downstream sub-systems that required integration with the new PLM platform. Initial planning apparently did not account for the sheer volume of sub-systems requiring integration. The lack of a full accounting of the real situation served as another stumbling block to a successful implementation.
In the manufacturing software world, enterprises are repeatedly advised to select technology from vendors with real-world vertical experience in the company’s industry. In the case of Ericsson and Dassault, the companies decided to jointly develop a telecom-specific solution within the framework of the 3DX platform. Dassault Systèmes presented a set of tools for creating a PLM infrastructure and software platform that would respond to the issues of the mobile telephony communications and network industry. In a 2016 interview with Engineering.com, Ericsson’s then-CIO, Johan Torstensson, said, “We liked Dassault’s roadmap and vision,” and added that these factors were the most important behind the choice. “We wanted to have a partner in developing the next generation telecom technology.”
Who’s at Fault?
A lot has been said about who is at fault in this business case. Verdi Ogewell puts the blame evenly on all players:
- Ericsson itself (the customer), for not having a single executive owner throughout the project, a lack of planning, and initially pushing for a turn-key solution in a very short time frame. In fact, this demonstrated a lack of experience in enterprise systems implementation for such a large and reputable company.
- The vendor, for committing to such a demanding – and, in hindsight, clearly impossible – timetable. Any implementer with more than a couple of years’ experience in PLM migrations would have said it was impossible to accomplish the goals in such a short time period. Strategic planning was missing in this case.
In my personal view, a successful implementation involves three interrelated parts: the customer, the implementer and the software. In a successful project, this would be presented as an equilateral triangle. Once one of the vertices is moved, the equilibrium is skewed. I believe that Ericsson’s lack of a consistent project owner and poor planning, coupled with the vendor’s inability to push back and say “no” while explaining to Ericsson why the timeline was virtually impossible, clearly pushed this project to failure.
We can extract these lessons from the details of the Ericsson situation:
- Allow a reasonable amount of time for the implementation and deployment of a new PLM solution that relates to the size and scope of the project, especially when a complex data migration is involved.
- If a big bang methodology is the choice, make sure that enough time is allocated for analysis and careful planning. Such planning should involve a core team of experts from all the divisions impacted by this process.
- Strongly consider a phased approach methodology, with small wins, rather than a big bang earthquake which could significantly impact manufacturing performance.
- Understand that PLM data migration is an intricate and complex process, requiring time to plan, clean up bad data, handle conversions, develop and configure tools, and test and evaluate the results before it can be called “successful.”
- Identify and understand all downstream sub-systems and business processes that require integration. Take all necessary steps to ensure interoperability. Examine current processes to identify where enhancements could deliver efficiency. The goal of PLM integration is to reduce redundancies and eliminate errors.
- Work with vendors with real-world industry experience.
- As a customer, ensure you have consistent “ownership” of the implementation with C-level authority. It was clear in the Ericsson case that ownership kept changing hands, which, in my opinion, was devastating to the project.
- As a vendor/implementer, learn to say no. We look at all implementations as partnerships with the end customer. Being honest and providing accurate information goes a long way, even when some of the advice is not what the customer wants to hear. Eagerness to win a project and committing to such aggressive and ultimately impossible migration timelines were clearly critical mistakes.
Are you looking for help with PLM implementation and data migration? Contact xLM Solutions.