
Migrating data? Make the most of the Product itself!
by Alle Sravani



In this article, I would like to share some key learnings from a data migration project I worked on in the past. My company was migrating its internal CRM system to Salesforce, and I was tasked with facilitating the data migration process. I ran into numerous obstacles along the way, but I was able to overcome them all. Below, I’ve included the best practices that helped me through the project. If you are currently working on or planning a data migration project, this article is for you. If you are completely new to the topic, I have tried to keep the language simple so that you can follow along and get a general understanding of how data migration projects work. Although my examples are specific to Salesforce, you might still find something valuable for the particular product you are working with.

Photo by Kevin Ku on Unsplash

Acquaint yourself with the product

I already knew the internal CRM system (let’s call it the legacy system) and the data architecture behind it, so all I needed to do was get comfortable with Salesforce. I began by exploring Trailhead, Salesforce’s free online learning portal. Salesforce also offers a free developer account (sign up here) that lets you explore the product and its features. I used this account until my employer gave me access to the platform. This initial investigation helped me discover product features that I could later put to good use.

Create your testing environment

If possible, set up a separate testing environment for yourself (and your team) to test the data pipelines and connections you’re building. If the product does not allow you to create a separate environment, you must establish some testing standards and techniques. All test data should be easily identifiable and deletable so that the production environment remains clean. You may not be the only one working on the project, and you certainly don’t want someone else’s activities to affect your data.

Fortunately, Salesforce provides the option to create such environments (called Sandboxes). Typically, a company maintains one sandbox for UAT testing, one for backup and recovery, and one or more for development purposes. The Professional edition of Salesforce gives you 10 Developer Sandboxes, which I think is quite cool. I could conveniently test all the API connections I had built, the data pipelines, and other code I had written, and I could create, modify and delete test records without worrying too much.
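To make this concrete, here is a minimal sketch of what such a sandbox round trip could look like, assuming Python and the simple-salesforce library (not necessarily the exact tooling from my project); the credentials, the Contact object and the marker value are placeholders.

```python
from simple_salesforce import Salesforce

# domain="test" points the login at a sandbox instead of production.
sf = Salesforce(
    username="me@example.com.devsandbox",
    password="********",
    security_token="********",
    domain="test",
)

# Create a clearly identifiable test record...
result = sf.Contact.create({"LastName": "TEST_MIGRATION_DELETE_ME"})
print("Created test contact:", result["id"])

# ...verify it can be queried back...
rows = sf.query(
    "SELECT Id, LastName FROM Contact WHERE LastName = 'TEST_MIGRATION_DELETE_ME'"
)
print("Found", rows["totalSize"], "test record(s)")

# ...and clean it up so the sandbox stays tidy.
sf.Contact.delete(result["id"])
```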

Design the architecture

Every business is unique, and no single product will fit perfectly. There will always be adjustments you have to make so that the data matches the business logic. You must find the best data structure for each table. This is also an excellent moment to review the current data architecture and identify any shortcomings that could be fixed in the new system. Salesforce allows customers to create custom objects (tables) and fields (columns) to store company-specific data. You can create up to 50 custom objects and 100 custom fields per object in the Professional edition. The best part is that these custom objects and fields have the same capabilities as the built-in ones.

For example, I was able to combine three tables into one and differentiate the records by adding a single additional field describing the record type. Leveraging parent-child relationships, I could easily set up aggregated fields on the parent object that adjusted automatically whenever the data on child records was updated. I set up validation rules on the fields so that the system would run quality checks on the data before storing it. I also modified the page layouts to adjust what data is shown to end users. I made several such small improvements, which resulted in a clean data architecture and reduced the overall complexity of the system.
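As an illustration of the “three tables into one” idea, here is a small pre-processing sketch, assuming Python with pandas and hypothetical legacy exports; the file names, the combined object and the field Record_Type__c are made up for the example.

```python
import pandas as pd

# Three legacy tables that share the same essential columns (hypothetical exports).
quotes = pd.read_csv("legacy_quotes.csv")
orders = pd.read_csv("legacy_orders.csv")
invoices = pd.read_csv("legacy_invoices.csv")

# Tag each row with its origin so a single custom field can distinguish
# them once they live in one combined Salesforce object.
quotes["Record_Type__c"] = "Quote"
orders["Record_Type__c"] = "Order"
invoices["Record_Type__c"] = "Invoice"

# Stack them into one table, ready to be mapped and imported.
combined = pd.concat([quotes, orders, invoices], ignore_index=True)
combined.to_csv("combined_for_import.csv", index=False)
```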

Automate your processes

There are two types of automation I’m talking about here:

  • Automate the business process
  • Build data pipelines to migrate data

Automating the business processes comes first. This reduces manual effort for end users and reduces errors. From a data point of view, it also means you have slightly less data to import. For example, the sales team would create an “opportunity” for a customer. If it reached the quotation stage, they would add the products and other attributes to generate a quotation. If the deal went through, the sales admin would then use the signed contract (created from the quotation) to create an “order” in the system with the same products. In the past, the sales and CRM systems were separate, so the double entry was unavoidable. With Salesforce, I was able to automate the order creation process using the built-in “Flow” feature, which fetches all the data directly from the opportunity itself. As a result, for new opportunities that were still in the processing stages on Salesforce, orders were later created directly in Salesforce and not in the old system.

The second part is automating the data ingestion process. Ideally, the data should be migrated only once; after that, users should only use the new system. Sadly, this was not the case for my organisation. There were certain intricate procedures and edge cases that did not fit in Salesforce, and we had to wait for the developers to build a custom solution for those. At the same time, management wanted the users to begin working in Salesforce, so both systems were used concurrently for a while. For the old data, updates were only made in the legacy system. For me it was a mess, because it was hard to keep track of what changed and I had to update the imported data multiple times. The data structures in the two systems also differed, and some pre-processing was required before I could import into Salesforce. I saved a lot of time and effort by writing scripts for these steps, ensuring the data in both systems stayed up to date.
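For illustration, here is a simplified sketch of what such a recurring sync script could look like, assuming Python with pandas and simple-salesforce; the legacy export file, the custom object Order__c and the external ID field Legacy_Id__c are hypothetical. Upserting on a legacy key is what keeps reruns from creating duplicates.

```python
import pandas as pd
from simple_salesforce import Salesforce

sf = Salesforce(
    username="me@example.com",
    password="********",
    security_token="********",
)

# 1. Take the latest export from the legacy system (here: a nightly CSV dump).
legacy = pd.read_csv("legacy_orders_export.csv")

# 2. Pre-process: rename columns to the Salesforce API field names and
#    reshape values so they match the new data structure.
legacy = legacy.rename(columns={"order_no": "Legacy_Id__c", "total": "Amount__c"})
legacy["Amount__c"] = legacy["Amount__c"].round(2)
legacy = legacy[["Legacy_Id__c", "Amount__c"]]

# 3. Upsert on the legacy key so reruns update existing records
#    instead of creating duplicates.
records = legacy.to_dict("records")
sf.bulk.Order__c.upsert(records, "Legacy_Id__c", batch_size=200)
```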

Keep an eye out for Data Governance

Because you are transitioning to a new system, you must exercise extreme caution before granting access to end users. There may be no data governance structure in place, and by default everyone can see everything, which can be quite dangerous. You must define roles, decide who has access to what, and determine the level of access you wish to grant. Salesforce has a robust framework in place to manage such roles and responsibilities. Hierarchies can be set up, read/write/view or no access can be controlled down to the field level, and permission sets can be configured to handle special scenarios. Whenever someone leaves the company, it is quite simple to deactivate their account and transfer all of their records to someone else. It’s a good idea to have at least one user set up as a test account so you can verify the data access and permissions granted to each role.
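One practical habit is to audit who actually has access before opening the system up. Below is a rough sketch of such a check using SOQL via simple-salesforce; this is just one possible approach, and the credentials are placeholders.

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="me@example.com",
    password="********",
    security_token="********",
)

# Active users and the profiles they hold.
users = sf.query("SELECT Name, Profile.Name FROM User WHERE IsActive = true")
for u in users["records"]:
    print(u["Name"], "-", u["Profile"]["Name"])

# Permission sets assigned on top of profiles, to catch special-case access.
assignments = sf.query(
    "SELECT Assignee.Name, PermissionSet.Label FROM PermissionSetAssignment"
)
for a in assignments["records"]:
    print(a["Assignee"]["Name"], "->", a["PermissionSet"]["Label"])
```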

Photo by Melanie Deziel on Unsplash

Maintain reports

Numbers speak volumes. Create reports on the migration tasks completed and, if possible, put them all on a dashboard to track your progress; if that is not feasible on the product itself, maintain such reports in Excel. Salesforce offers an excellent reporting tool. Its dashboards are quite limited compared to standard BI tools, but they are still useful. For this project, I set up a separate dashboard for data quality: I created multiple reports that flagged missing data and discrepancies and placed them all on one dashboard. The dashboard was shared with all users, and each user could only see their own records that had data issues. This made it easier to fix the problems and improved data quality over time.
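As a simplified example of the kind of check behind such a report, here is a sketch that flags migrated records with missing key fields, again assuming simple-salesforce; the Account fields used as criteria are purely illustrative.

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="me@example.com",
    password="********",
    security_token="********",
)

# Flag accounts that came over from the legacy system with key fields missing;
# the criteria here (BillingCountry, Industry) are just examples.
soql = (
    "SELECT Id, Name, Owner.Name "
    "FROM Account "
    "WHERE BillingCountry = null OR Industry = null"
)
result = sf.query_all(soql)

print(result["totalSize"], "accounts with missing data")
for row in result["records"]:
    print(row["Name"], "(owner:", row["Owner"]["Name"] + ")")
```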

Photo by Towfiqu barbhuiya on Unsplash

There are numerous additional capabilities that I also used, such as Apex code, approval processes, email services, Visualforce pages, Chatter, and so on, but those were more on the development side and less on the data side. I don’t want to go into every element of Salesforce that impressed me, so I’ll stop here. Being a data person, I must admit that their product impressed me. I hope you found this article useful. Do check out my previous article, where I talked about the challenges I encountered when creating a Tableau dashboard using Big Data.

