Lifion’s Cloud Transformation Journey

On moving to managed services in a microservice architecture

Zaid Masud

Mar 26, 2019 · 5 min read

Since Lifion’s inception as ADP’s next-generation Human Capital Management (HCM) platform, we’ve made an effort to embrace relevant technology trends and advancements. From microservices and container orchestration frameworks to distributed databases, and everything in between, we’re continually exploring ways to evolve our architecture. Our readiness to evaluate non-traditional, cutting-edge technology has meant that some bets have stuck while others have pivoted.

One of our biggest pivots has been a shift from self-managed databases & streaming systems, running on cloud compute services (like Amazon EC2) and deployed with tools like Terraform and Ansible, towards fully cloud-managed services.

When we launched this effort in early 2018, we began with a structured, planned initiative across an organization of 200+ engineers. After overcoming the initial inertia, the effort continued to gain momentum, eventually taking on a life of its own and finally becoming fully embedded in how our teams work.

Along the way, we’ve been thinking about what we can give back. For example, we’ve previously written about a Node.js client for AWS Kinesis that we’re developing as an open source initiative.

AWS’s re:Invent is perhaps the largest cloud community conference in the world. In late 2018, we presented our cloud transformation journey at re:Invent. As you can see in the recording, we described our journey and key learnings in adopting specific AWS managed services.

In this post, we discuss key factors that made the initiative successful, its benefits in our microservice architecture, and how managed services helped us shift our teams’ focus to our core product while improving overall reliability.

Why Services Don’t Share Databases

The notion of services sharing a database, connecting directly to the same database system and depending on shared schemas, is a recognized microservice anti-pattern. With a shared database, changes to the underlying system (including schema changes, scaling operations such as sharding, or even migrating to a better-suited database) become very difficult, because they require coordination across multiple service teams and releases.

As Amazon.com CTO Werner Vogels writes in his blog:

Each service encapsulates its own data and presents a hardened API for others to use. Most importantly, direct database access to the data from outside its respective service is not allowed. This architectural pattern was a response to the scaling challenges that had challenged Amazon.com through its first 5 years…

And Martin Fowler on integration databases:

On the whole integration databases lead to serious problems becaue [sic] the database becomes a point of coupling between the applications that access it. This is usually a deep coupling that significantly increases the risk involved in changing those applications and making it harder to evolve them. As a result most software architects that I respect take the view that integration databases should be avoided.
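The pattern both quotes describe can be sketched in a few lines of Python. This is an illustrative toy, not Lifion’s actual code; the service names and methods are invented. Each service owns its data store outright, and peers interact only through its public API:

```python
class BillingService:
    """Owns its data store outright; no other service touches it directly."""

    def __init__(self):
        self._db = {}  # private storage; the schema can change freely

    def record_invoice(self, invoice_id, amount):
        self._db[invoice_id] = {"amount": amount, "status": "open"}

    def get_invoice(self, invoice_id):
        # The hardened API: peers see a stable contract, not the schema.
        record = self._db[invoice_id]
        return {"id": invoice_id, "amount": record["amount"]}


class ReportingService:
    """Depends on BillingService's API, never on its database."""

    def __init__(self, billing):
        self._billing = billing

    def invoice_total(self, invoice_ids):
        return sum(self._billing.get_invoice(i)["amount"] for i in invoice_ids)
```

Because ReportingService never sees BillingService’s internal schema, the billing team can reshard, restructure, or replace its storage without coordinating a release with its consumers.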

The Right Tool for the Job

Applying the database-per-service principle means that, in practice, service teams have significant autonomy in selecting the right database technologies for their purposes. Among other factors, their data modeling, query flexibility, consistency, latency, and throughput requirements dictate which technologies work best for them.

Up to this point, all is well: every service has isolated its data. However, when architecting a product with a double-digit number of domains, several important database infrastructure decisions still need to be made:

  • Shared vs dedicated clusters: Should services share database clusters with logically isolated namespaces (like logical databases in MySQL), or should each service have its own cluster with dedicated (and more costly) resources?
  • Ownership: What level of ownership does a service team take for the deployment, monitoring, reliability, and maintenance of their infrastructure?
  • Consolidation: Is there an agreed set of technologies that teams can pick from, is there a process for introducing something new, or can a team pick anything they like?
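To make the shared-vs-dedicated trade-off concrete, here is a hypothetical sketch of the two connection layouts; the hostnames and service names are invented for illustration:

```python
# Shared cluster: one set of nodes, logically isolated namespaces.
# Every tenant is exposed to noisy neighbors and cluster-wide maintenance.
shared = {
    service: {"host": "mysql-shared.internal", "database": f"{service}_db"}
    for service in ("identity", "payroll", "benefits")
}

# Dedicated clusters: stronger isolation per service, but far more
# infrastructure to deploy and maintain yourself.
dedicated = {
    service: {"host": f"mysql-{service}.internal", "database": "app"}
    for service in ("identity", "payroll", "benefits")
}
```

The dedicated layout isolates failures but multiplies the operational load, which is exactly where managed offerings change the economics.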

From Self-Managed to Fully Managed Services

When we first started building out our services, we had a sprawl of supporting databases, streaming, and queuing systems. Each of these technologies was deployed on AWS EC2, and we were responsible for the full scope of managing this infrastructure: from the OS level to topology design, configuration, upgrades, and backups.

It didn’t take us long to realize how much time we were spending on managing all of this infrastructure. When we made the bet on managed services, several of the decisions we’d been struggling with started falling into place:

  • Shared vs dedicated clusters: Dedicated clusters for services, clearly preferable from a reliability and availability perspective, became easier to deploy and maintain. Offerings like SQS, DynamoDB, and Kinesis with no nodes or clusters to manage removed the concern altogether.
  • Ownership: Infrastructure simplification meant that service teams were able to develop further insight into their production usages, and take greater responsibility for their infrastructure.
  • Consolidation: We were now working with a major cloud provider’s service offerings, and found that there was enough breadth to span our use cases.

Evolutionary Architecture

On our Lifion engineering blog, we’ve previously written about our Lifion Developer Platform Credos. One of these speaks to the evolutionary nature of our work:

  • Build to evolve: We design our domains and services fully expecting that they will evolve over time.
  • Backwards compatible, versioned: Instead of big bang releases, we use versions or feature flags letting service teams deploy at any time without coordinating dependencies.
  • Managed deprecations: When deprecating APIs or features, we carefully plan the impact and ensure that consumer impact is minimal.
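A minimal sketch of the backwards-compatible credo, assuming a versioned route table (the endpoints and field names are hypothetical): both versions are served side by side, so consumers migrate on their own schedule, and the old version is later retired as a managed deprecation.

```python
def get_employee_v1(record):
    # Original contract: a single "name" field.
    return {"id": record["id"], "name": record["name"]}

def get_employee_v2(record):
    # Evolved contract: structured name, rolled out without breaking v1.
    first, _, last = record["name"].partition(" ")
    return {"id": record["id"], "first_name": first, "last_name": last}

# Both versions stay deployable at any time; no coordinated "big bang"
# release is needed to introduce v2.
ROUTES = {
    "/v1/employees": get_employee_v1,
    "/v2/employees": get_employee_v2,
}
```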

When we started adopting managed services, we went for drop-in replacements first (for example, Aurora MySQL is wire compatible with the previous MySQL cluster we were using). This approach helped us to get some early momentum while uncovering dimensions like authentication, monitoring, and discoverability that would help us later.
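Wire compatibility is what makes a drop-in replacement possible: the driver, ORM, queries, and schema all carry over, and the cutover is (mostly) a connection-string change. A hedged sketch with invented hostnames:

```python
# Before: self-managed MySQL running on EC2.
db_config = {
    "host": "mysql-primary.internal",
    "port": 3306,
    "user": "app",
}

# After: Aurora MySQL. Because Aurora speaks the MySQL wire protocol,
# the application stack is unchanged; only the endpoint moves.
db_config["host"] = "my-cluster.cluster-abc123.us-east-1.rds.amazonaws.com"
```

The real migration, of course, also has to account for the dimensions mentioned above, such as authentication, monitoring, and discoverability.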

Our evolutionary architecture credo helped to ensure that the transition would be smooth for our services and our customers. Each deployment was done as a fully online operation, without customer impact. We recognize that we will undergo more evolutions, for which we intend to follow the same principles.


C-suite execs give future technology predictions for the decade


What will keep CIOs busy this decade? From machine learning to cybersecurity, IT leaders are providing insights on technologies they predict will be accelerated over the next 10 years.

It’s difficult to predict how technology will influence businesses even a few months into the future, though that lack of clarity doesn’t stop analysts and experts from making annual predictions. Indeed, the prognostications certainly proliferated as 2020 began. It was a new year, and it was time, yet again, to look at how AI, IoT, cybersecurity, and a wide range of other technology developments could either boost or derail the modern enterprise.

Yet, even though technology advances rapidly, looking only a year ahead can be short-sighted. It’s worth looking further into the future to better prepare organizations for the long-lasting consequences of planning new tech initiatives and managing old ones.

A handful of C-suite executives offered their insights on what might occur over the entire decade, from familiar topics such as the dangers of machine learning and cybercrime to issues generating far less debate.

Technology won’t replace humans

The debate over whether technology will replace human labor is as old as the Industrial Revolution. Many IT leaders acknowledge that some developments including AI, robotic process automation and autonomous cars will replace human workers, but several others contend these technologies will create different jobs that will need to be filled by humans.

Vipul Nagrath, global CIO at ADP, is among those who believe there will always be a place for many human laborers. “As there is a shifting of work, you will always need people to write the programs, the algorithms,” he said.

Vipul Nagrath

And new ideas can lead to new labor outputs. For instance, “the smartphone has created more industries than it has taken away,” Nagrath said. Even though people now rely on smartphone map applications to get from A to B — diminishing the need for paper maps — GPS-supported maps have created many new business opportunities within digital applications, he said.

Unchecked AI could cause moral problems

Just as the idea of robots taking over human jobs causes consternation, unrestrained AI similarly stirs fear. Human error or bad intent in the programming of machine learning systems has sparked predictions that AI could run amok. Several tech companies have pushed for regulations around AI, with the aim of addressing questions about potential biases and lack of accuracy.

James Carder

James Carder, CSO and vice president at the security firm LogRhythm Labs, went as far as predicting that manipulated AI could wrongly put people in prison this decade. Despite warnings of human biases in the training of AI, the legal system relies on the technology to handle voluminous casework, leaving the door open to the sort of exploitation that could implicate someone in a crime, according to Carder. “You can trick AI if you inject a bunch of bad data and say, ‘This looks good.’ You can take that data and say it’s trying to achieve X or Y,” he said. But in reality, its results lean toward a predetermined conclusion.

At ADP, employees are already mindful of the dangers of biases and other misuses of AI, Nagrath said. The company favors explainable AI in its payroll technology innovations, a process that allows people to inspect and review the genesis of data to see how the AI-driven results were achieved, according to Nagrath. Such transparency will matter if AI is increasingly determining financial results such as paychecks. Employees will want to know the reasoning behind compensation in a paycheck, compelling ADP and its clients to explain in real time how those figures were calculated, he said.

Anurag Kahol

The U.S. gets its own privacy law

The EU famously has GDPR, codified standards for how companies that conduct business with EU citizens have to manage and safeguard those customers’ data, but the U.S. does not have such a blanket law. That could change in 2020, when a similar law will at least be considered by Congress, said Anurag Kahol, CTO and co-founder of the cloud security company Bitglass.

A consumer privacy act went into effect in California this year, and the New York legislature is currently considering one. But a comprehensive federal law is “needed to avoid a patchwork of differing data privacy laws from each state, to facilitate more nationwide business and to enable international commerce,” Kahol said. “Facing numerous regulations can be a barrier that keeps foreign businesses from entering a market.”

5G might or might not be revolutionary

For several years now, 5G technology has been touted as the next big thing. It is expected to have lower latency than previous cellular technologies, improving computing power for technologies on the edge as well as those connected to IoT and other digital services. But 5G is rolling out more slowly than predicted, with telecommunications companies building infrastructure at a slower pace than the hype estimated. It’s not yet known just how potent 5G will be, leading companies to hold their breath and defer any big investments for the time being.

Nagrath said ADP’s mobile-first focus will continue, regardless of the speed of the 5G rollout. “We’ll get the capabilities to do things with mobile that we don’t have the capabilities to do today, and that’s true for many players in the industry,” he said. Still, ADP can’t plan around 5G just yet. Innovation will have to continue with the tools being offered today. Nagrath joked that his day-to-day work continues whether he is relying on 5G or a Wi-Fi router.

Chris DeRamus

Cybersecurity will influence mergers and acquisitions

Chris DeRamus, co-founder and CTO at the cybersecurity company DivvyCloud, predicted that companies will be more guarded as they move to acquire other businesses or complete mergers. He pointed to the breach of 339 million customer data records from the hotel chain Starwood, which the company announced shortly after it was acquired by the larger chain Marriott.

“Organizations will place cloud security at the forefront of the [mergers and acquisitions] process,” DeRamus said, adding that they will do so by conducting thorough audits of the cloud infrastructures of the companies they are merging with or acquiring.

There’s promise, but caution, with biometrics and IAM

DeRamus said he believes it’s a sound approach if organizations try to protect assets through identity access management (IAM), but still warned that it’s more difficult to manage than it seems. “In this new decade, security professionals are going to realize that IAM is an area where they can lose control rapidly, and it is very hard to take back,” he said. Whereas IT could previously audit the digital footprints of 1,000 employees connected to mainframe servers, security pros now have to stay on top of thousands of cloud gateways.

“Approaches and strategies from the data center world don’t transfer,” DeRamus said. He recommended establishing strong cloud account structures, “starting with curbing what people have access to.” That means dividing users into groups for specific resources.
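The group-based account structure DeRamus describes can be sketched as a simple allow-list check; the groups, users, and resource names here are hypothetical:

```python
# Each group is granted only the resources it needs (least privilege).
GROUP_RESOURCES = {
    "data-eng": {"warehouse", "etl-bucket"},
    "web-ops": {"cdn-config", "app-logs"},
}

# Users get access exclusively through group membership, never directly,
# so auditing thousands of cloud gateways reduces to auditing a few groups.
USER_GROUPS = {
    "alice": {"data-eng"},
    "bob": {"web-ops"},
}

def can_access(user, resource):
    """Allow access only via a group grant."""
    return any(resource in GROUP_RESOURCES[g] for g in USER_GROUPS.get(user, ()))
```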

Biometric authentication has become a popular access tool, but Carder warned that although it seems airtight today, in the future it could pose an even greater issue than a lost password. “There are going to be some unlucky people whose biometric information is stolen and used for repeated fraud,” he said. “If your credit card details are stolen, you can easily change your account number. But what if [the image] of your face gets stolen? Once that information is compromised, there’s no swapping it out.”

He added that the consequences of hacked biometric data could pose a particularly big problem in healthcare, making tighter security and regulations around access a pressing matter that should be addressed now.

Research will go a long way in the 2020s

Alex Holden, a security expert for the nonprofit tech association ISACA and CISO of the consulting firm Hold Security, pointed to how each decade has had its own tech theme.

Alex Holden

“In the 1990s, we were fighting to keep our systems from crashing. In the 2000s, we worked on stability and functionality. In the 2010s, we fought for availability and against breaches. Now, in the 2020s, it is a race against time,” he said.

That race is against cybersecurity threats that are arriving faster and more effectively, in industries short on security experts but long on cybersecurity products, according to Holden. Facing those kinds of odds, he recommended that organizations do more than rely on security products and the advice of those vendors. “You can’t buy something off the shelf and believe you’re now secure,” he said. “And you don’t want to get threat information only from vendors who have a solution — they’re more than likely to sell something that can fix a problem but not anything beyond that.”

Instead, the 2020s call for self-education and self-awareness on cybersecurity and other tech issues, Holden said. C-suite executives should read daily technology security briefings from trusted technology pundits and immerse themselves in objective research that covers how their industries are handling tech. According to Holden, it’s a high-stakes battle for which there is a military analogy. “If you are a general, you have to get the state of the troops today; it can’t be yesterday,” he said. “Things change fast.”



ADP’s Martha Bird on the ground at CES 2020 to talk about NLP/ML and AI, plus the top 5 themes for 2020


ADP’s business anthropologist, Martha Bird, reports on the top five themes at the 2020 Consumer Electronics Show that are important for today’s industry leaders.

With over 4,000 exhibiting companies, 2.9 million square feet of exhibit space, more than 180,000 attendees, and 307 Fortune 500 companies, there was a lot to take in at CES 2020 in Las Vegas. Some of the most innovative technologies on display included a flying taxi (Hyundai), electric multi-modal transportation, electric vertical take-off and landing craft (Uber), cool and creepy robotics, green and sustainability tech, 8K bezel-less TVs (Samsung), an AI-attended drive-thru (McDonald’s lab), 150 digital health exhibitors, and much more. Within this tech frenzy, it was my great pleasure to represent ADP on stage and in studio, where I discussed how natural language processing, machine learning, and artificial intelligence (NLP/ML and AI) are impacting the workplace: the tools, the processes, and the people.

While it was impossible to see everything given the sheer magnitude of the event, here are some high-level reflections on what I consider the pervasive themes of this year’s show, themes industry leaders should keep their eyes and ears open for moving into 2020. These are my top five:

1. 5G: Data, data, and more data

On the CES floor, data was the common denominator across the products and services on display and demoed. Given the explosion of data-contingent technologies, online privacy and security was a central talking point. How different regions address security concerns around data and privacy was less explicitly articulated, although a continuum from highly private to blatantly public could be surmised, along with a definite trend toward the true consumerization of AI.

Which brings me to 5G. In the next two to three years, networks will expand exponentially. The first commercial deployments are already live, but 5G is still in its infancy, so it won’t be a matter of simply “flipping a switch” from 4G to 5G.

Along with 5G’s increased speed, greater capacity, and lower latency come huge possibilities for disruptive innovation. There was no limit to 5G talk and imagination at CES 2020. And, of course, there were both pronouncements and announcements about the coming of 5G handsets. AT&T and Verizon are aggressively building out infrastructure in an attempt to get ahead of the competition across the globe.

5G will be the “central nervous system of the data age,” according to Steve Koenig, VP, Research at the Consumer Technology Association (CTA).


[Inset above] ADP’s Business Anthropologist Martha Bird (right) took the stage at CES 2020. Bird’s panel “Emerging Technologies Enabling Enterprise” was moderated by Michael Miller, Editor-in-Chief at PC Magazine (middle) joined by fellow panelist Yonatan Wexler, Executive VP of R&D at OrCam Technologies (left).

2. IoI (Internet of Intelligence): The Decade of Connected Intelligence

Just as we were getting accustomed to the term IoT (Internet of Things) the talk this year was around IoI or “Internet of Intelligence.” This new way of thinking is a direct response to the way AI is being integrated into all facets of our technology and consumer culture.

We were told in the plenary keynote that as networks grow, we can expect 5G to unlock more opportunities for enterprise. Building upon what we’ve seen with IoT technologies (think smart home apps that rely on little bits of discrete data), the expansion of 5G and AI capabilities will provide multiple nodes of data informing a much more complex and interdependent data landscape. Enterprise applications are expected to lead in IoI, in part because of massive data resources and the ability to form mutually beneficial partnerships across OEM, software, and engineering. IoI covers things like remote robotic surgery and smart cities: activities with a heavy data lift and, generally speaking, much higher stakes than, say, a voice-activated light in your home.

3. XR: The New Reality Training Our Future Workforce

XR is the umbrella term for augmented, virtual, and mixed reality technologies: think of a virtual world extending up, down, left, and right, experienced in 360 degrees. Form factors delivering the technology ranged from 5K gaming chairs to sleek eyeglasses very much unlike the early Google Glass. Again, enterprise will have a big stake in this area, with many use cases including B2B workforce training, safety inspections, AR glasses an architect can use to design a room, training surgeons across geographies, and travel and tourism, where you can take a trip to a tropical island right from your living room. Frankly, I prefer the actual trip, but forgoing the lines at the airport and customs does sound appealing. Regardless of my preference, there was a lot of excitement for XR in commercial and industrial settings, not to mention eSports, which realized $1 billion in net revenue last year alone.

4. Culture: Pragmatics of Technological Innovation

While attending a panel discussion on “Future Cities” I was struck by a similarity between re-architecting an existing urban space to accommodate new technologies and the work we do at ADP.

A former secretary of transportation listed one of the greatest challenges to innovating cities as the pre-existing roadway infrastructure. He went on to say that between the legacy streets and traffic patterns it was actually the inability to imagine new ways of mobility that was the major barrier.

People get accustomed to “how things are done here” and find it difficult to adapt to changes in the system. This is both a cultural and a technical matter. Culture, at the most basic level, is the collection of practices and beliefs we take for granted, and these habits are slow to change. New technical opportunities can catalyze innovation and cultural change, but the process is never one-to-one.

Which brings me to humans.

5. Humans: Agency in a Data-driven Era

Humans (people like you and me), when faced with the explosion of new technologies that augment our vision, our speech, our bodies, and even our memory, begin to question their own reason for being. These existential ponderings about what it means to be human are concomitant with the group of technologies loosely described as “AI.”

Talk of “machine-human partnership” was pervasive on the CES exhibition floor and in panels and keynotes. For my part, I welcome the question as it points to a shared humanity that we often overlook. Yes, partnerships between people and technology will continue to evolve. Who has agency over the relationship will remain a critical point of personal reflection and public debate.
