For years, the 3D printing company MakerBot hosted all its software and data through Amazon’s cloud platform, AWS. But in 2018, the firm started seeking out alternatives in order to save money, reduce latency, and make life easier for its small engineering team.
MakerBot ultimately hired SADA Systems to help it make the migration to Google Cloud, since the services firm offers its clients support engineers, customer success managers, and more.
MakerBot started planning the move in February 2019, and by that summer it had set up teams to manage the process. It started preparing its production environment in September, and by the end of that year its traffic was served entirely through Google Cloud.
Ultimately, MakerBot estimates that it cut its monthly costs by about 30% by switching from Amazon to Google, the company's lead cloud software architect Erik Ahrend told Insider (the firm declined to share information about its overall spend).
The switch saved its engineers time, too:
MakerBot previously self-managed its use of Kubernetes, the open source container orchestration project, on AWS's Elastic Compute Cloud, which required significant effort from its small team. It now uses a managed service, so its developers spend less time on cluster operations while Google Cloud handles most of the operational work.
Migrating cloud databases can be a challenging process
Throughout the process, Google Cloud partner SADA played a big role, Ahrend said, especially when MakerBot moved data from Amazon’s Aurora databases to Google’s BigQuery.
“I think we’d still be struggling right now if it wasn’t for SADA,” Ahrend told Insider. “Database is definitely the largest hiccup. SADA was there for us the entire time.”
MakerBot also isn’t completely off AWS: It still uses storage service Amazon S3 for some of its data.
“There are some masses we can’t move out of AWS because the cost to move them would be too large and the amount of development work is too large,” Ahrend said.
MakerBot estimates that its entire database migration took between four and six hours, with about four hours of downtime total.
Moving databases was a challenging process in part because of differences between Aurora and BigQuery: while the tools serve similar purposes, each comes with its own learning curve, Campbell said.
“When you look at the different clouds, they have similar services but the ways you operate them are unique to each of the clouds,” Campbell said. “One of the things I talk about with moving to GCP, it’s like knowing Italian and Spanish.”
“I think that’s a testament to what Google is doing in the cloud world and their competition with Microsoft and Amazon,” Campbell said. “We’re seeing GCP continue to gain traction in the market as well.”