APIs can breathe new life into old business architecture, but for some organizations the process is difficult. Wrapping a 10+ year old ERP system in a clean interface is no easy task.
Drawing on design lessons learned throughout the research, development, and implementation stages, in this post we study a case that epitomizes the “mullet in the back” philosophy. Read on to learn what processes created a fully functional and partnership-critical API as we trace the history of the PlanMill API. From simultaneous UI development and eating your own dog food to the reasoning behind specific architecture decisions, Senior Consultant & Manager Marjukka Niinioja describes to Nordic APIs the nitty-gritty details of developing an enterprise-level API from scratch.
What is an ERP?
PlanMill is an Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) application: a real-time front- and back-end project management platform for professional services. The PlanMill API exposes the PlanMill ERP platform, allowing data exchange between various business applications to expedite processes like accounting and CMS integration. Perhaps due to their sheer weight and business-oriented subject matter, ERPs are often seen as pretty boring. According to Niinioja:
“ERPs are often seen as grandpa’s sleigh — they are often heavy and hard to maneuver”
The Slow Initial Release
As Niinioja says, the PlanMill API went through a painful “birthing process”: a 12-month period of stress and testing. An early entrant to the API game, PlanMill API 1.0 originated in 2009. A very quiet launch worked well, as it gave the team time to create usable developer documentation, envision a pricing model, speak with consultants, and build a potential customer base.
Eventually, some of the first real use cases occurred internally, with larger projects and time-reporting integrations. This was followed by the first big API use case within a larger application, which brought massive improvements to the API. The PlanMill API eventually made the first partnership possible (Atlassian Confluence with Ambientia). According to Niinioja, the presence of an API became an immediate selling point compared to competitors in the ERP space.
API Usability Problems
So, what did the PlanMill team learn along the path to version 1.0? Developer experience is a critical component of any API strategy, and designing documentation and API functionality to cater to users’ needs is essential to success in the API space. Though certain standards and API design guides do exist, actual implementation is almost always handled case by case. What PlanMill did correctly was listen to customer feedback and improve the overall experience to increase usability.
As the system originally intermingled much of its functionality with a backend that was 10+ years old, the following usability concerns arose in early use of the API:
- Calls: “Why should this take 5 requests? Can’t I consolidate it into a single call?”
- Rate limits: Some users would send 10 requests per millisecond, crashing the system.
- Documentation: Not every granular function was initially documented.
- Error messages: Users received raw stack traces as error messages in API responses.
- Response ID: The response to a create operation didn’t include the new resource’s ID.
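The last two fixes can be sketched in a few lines. Below is a hypothetical create handler (illustrative only, not PlanMill’s actual code) that echoes the new resource’s ID in the response and maps internal exceptions to a stable, documented error envelope instead of leaking a stack trace:

```python
import json

def create_resource(store, payload):
    """Hypothetical create handler for illustration only."""
    try:
        if "name" not in payload:
            raise ValueError("missing required field: name")
        new_id = len(store) + 1            # naive ID assignment for the sketch
        store[new_id] = payload
        # Echo the generated ID back so the client never has to re-query.
        return 201, {"id": new_id, "name": payload["name"]}
    except ValueError as exc:
        # Map internal exceptions to a documented error envelope;
        # never return raw stack traces to API consumers.
        return 400, {"error": {"code": "VALIDATION_ERROR", "message": str(exc)}}

status, body = create_resource({}, {"name": "Project X"})
print(status, json.dumps(body))  # 201 {"id": 1, "name": "Project X"}
```

A consistent error shape like this lets clients branch on a machine-readable code rather than parse traceback text.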
Improved Process for API 2.0
For API version 2.0, a new, improved development process was necessary. Gearing up for a new product lifecycle, the PlanMill team performed impressive customer research to dial into what their customers desired. The team stressed iterative changes, bolstered by continuous rounds of modeling the product both internally and externally, gathering feedback from the developer community by demoing and attending API-related events.
- Research project: The team created a thesis-driven formal research project to investigate the needs of their customers, asking partners to share their own experiences using the API and thoughts on how to improve the API. They found, in general, that customers were eager to share if it meant helping to improve the system.
- Pilot technologies and first service: Using the research, the team brainstormed what they could do differently if they could start from scratch. They redesigned their technology stack and masked old components that weren’t well designed in order to offer a clean interface to their clients. Sleek in front, mullet in back.
- Demos: What followed was internal discussion, knowledge sharing, attending seminars, and demoing the API.
- Further research: The team read Nordic APIs, researched the community, tried new approaches, decided on architecture and technologies, attended additional seminars, and researched monetization strategies.
- Internal beta: An operational version developed for internal testing.
- Ate own dogfood w/ new UI: A new user interface was designed simultaneously with backend development. The team found that insights from a UX perspective were critical for creating an intuitively designed API.
- Public beta 1.5: This stage involved developer community testing and opening the API for third-party developers to use. The team held a usage seminar, reached version 1.5, and made incremental updates.
- Feedback from developer community: Currently the team is at this stage.
- Publish 2.0: Plans to publish full platform soon. The team aims to use their research, feedback, new UI, architecture decisions, monetization strategy, and more to culminate in a well-informed and well-prepared version 2.0 release.
”We ate enormous amounts of dog food, I never want to see dog food again”
Specific Architecture Decisions Made Along the Way
One can see the many points where PlanMill revisited the Analysis stage to check the pulse of the users and pivot their product accordingly. Let’s now take a peek under the hood to examine what specific architecture choices were made in response to their research.
Authentication: HMAC and API keys
PlanMill went with HMAC as an authentication scheme because it supports B2B communication through system-level integration. For PlanMill services, server-to-server file sharing is preferred for transfers such as payrolls. Though PlanMill acknowledges they still need to improve their unified identity management with additional technologies, HMAC is adequate for their current situation, in which one user with one set of credentials talks to another single user with a single set of credentials.
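The server-to-server pattern is easy to see in a minimal HMAC sketch. The parameter names and the exact string-to-sign below are illustrative assumptions, not PlanMill’s actual scheme: each side holds the shared secret, the caller signs the request, and the receiver recomputes the signature and compares it in constant time.

```python
import hashlib
import hmac

# Hypothetical credentials for the sketch; the secret is never transmitted.
API_SECRET = b"shared-secret"

def sign_request(method: str, path: str, timestamp: int) -> str:
    """Client side: sign method + path + timestamp with the shared secret."""
    message = f"{method}\n{path}\n{timestamp}".encode()
    return hmac.new(API_SECRET, message, hashlib.sha256).hexdigest()

def verify(method: str, path: str, timestamp: int, signature: str) -> bool:
    """Server side: recompute the signature and compare in constant time."""
    expected = sign_request(method, path, timestamp)
    return hmac.compare_digest(expected, signature)

ts = 1700000000
sig = sign_request("GET", "/api/1.5/projects", ts)
print(verify("GET", "/api/1.5/projects", ts, sig))  # True
```

Including a timestamp in the signed message is a common way to limit replay of captured requests.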
Data Format: JSON (and More) over XML
The PlanMill team chose slick JSON, a format that’s simpler to understand and requires less configuration overhead than XML. Some might scoff when they hear that in addition to returning JSON, the PlanMill API produces formats like CSV and… PDF. In their case, this return type actually makes sense, as certain business intelligence software consumes PDF or CSV for its operations.
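Serving multiple representations of the same data is usually done with content negotiation on the `Accept` header. A minimal sketch, with invented example data and PDF rendering omitted for brevity (this is not PlanMill’s implementation):

```python
import csv
import io
import json

projects = [
    {"id": 1, "name": "Website redesign", "hours": 120},
    {"id": 2, "name": "ERP migration", "hours": 300},
]

def render(rows, accept: str) -> str:
    """Pick a representation based on the Accept header value."""
    if accept == "text/csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["id", "name", "hours"])
        writer.writeheader()
        writer.writerows(rows)
        return buf.getvalue()
    # Default representation: application/json.
    return json.dumps(rows)

print(render(projects, "text/csv").splitlines()[0])  # id,name,hours
```

The same resource and URL can then feed both a JSON-consuming integration and a CSV-consuming BI tool.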
HTTP Verbs Properly Used
The PlanMill API utilizes the standard HTTP verbs (GET, POST, PUT, DELETE). Though not implemented in PlanMill API v1.5, Niinioja advocates the use of PATCH because it can alleviate the headaches and bloat of updating delta records. Minute changes in large datasets, such as a minor user-info edit, become heavyweight when every update requires fetching and resending the full record. The team plans to embrace proper PATCH standards with the 2.0 release.
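The delta-update idea is easy to illustrate with JSON Merge Patch (RFC 7386) semantics, a common way to implement PATCH; this is a generic sketch with an invented record, not PlanMill’s own update logic:

```python
def json_merge_patch(target: dict, patch: dict) -> dict:
    """Apply an RFC 7386 JSON Merge Patch: null deletes a field,
    other values overwrite, and fields absent from the patch are kept."""
    result = dict(target)
    for key, value in patch.items():
        if value is None:
            result.pop(key, None)          # null means "remove this field"
        elif isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = json_merge_patch(result[key], value)  # recurse
        else:
            result[key] = value            # overwrite or add
    return result

user = {"id": 42, "name": "M. Niinioja", "title": "Consultant"}

# A PUT would resend the whole record; a PATCH sends only the delta:
patch = {"title": "Senior Consultant"}
print(json_merge_patch(user, patch)["title"])  # Senior Consultant
```

For large records, the patch body stays a few bytes regardless of how big the full resource is, which is exactly the bloat PATCH avoids.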
”We decided to go with RAML because it was a new standard… it was quite easy to adopt. Even then, there were a lot of tools to generate documentation out of it. It’s easy to generate SDKs out of it in various languages”
PlanMill also leverages APImatic to generate SDKs in various programming languages.
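A RAML definition is a human-readable YAML document from which documentation and SDKs can be generated. A minimal sketch of what such a spec looks like (the resource names and URIs here are illustrative, not PlanMill’s actual spec):

```raml
#%RAML 0.8
title: Example ERP API
version: "1.5"
baseUri: https://example.com/api/{version}
/projects:
  get:
    description: List projects visible to the authenticated system.
    responses:
      200:
        body:
          application/json:
            example: |
              [{ "id": 1, "name": "Website redesign" }]
  post:
    description: Create a project; the response echoes the new ID.
```

Because the spec is machine-readable, the same file can drive documentation sites, mock servers, and generated client SDKs.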
REST or SOAP?
Is REST really better than SOAP? Is SOAP better for critical business transactions? There are arguments for both sides. For PlanMill, SOAP and WSDL still have a place in the platform for invoice, account, and payroll operations. However, as Niinioja describes:
”A good REST API with proper documentation, including JSON schemas, JSON samples, RAML documentation — or some other format — and also error codes, is much, much better. It’s also something that forces people to design it more simpler.”
Sharing Data Carefully and Securely
It’s important to note that in the B2B environment, some critical data and company secrets cannot escape the system, yet still must be shared via APIs across partner channels.
Things like bank account numbers, sick leave records, and payroll data need to be shared with specific systems, but cannot escape those boundaries. There are also government and trade treaty controls: in Finland, PlanMill’s home base, personal data laws place strict confines on what can legally be shared. A company may also have internal policies controlling shared data. All of this can inhibit open integration with platforms like Zapier or IFTTT.
In the end, Niinioja recognizes that a business needs to support both openness, for sharing basic contact information and other data that can be shared freely, and point-to-point routes, for sharing API data between partner systems through secure connections.
Understanding the Life of an API
PlanMill understood the lifecycle of their API, bringing an agile mindset to enterprise API platformization through research and sharing. Each stage in this process plays an important role in advancing the project as a whole, from analysis through development, operations, and versioning.
Though PlanMill still has their work cut out for them, hopefully this case study gives insight into the mistakes made early in the process and how they were rectified, so that new API practitioners can approach their own releases armed with this knowledge. We wish PlanMill a successful 2.0 release, and hope their production saga can serve as a model for others to succeed in the API space.
[PlanMill has participated in Nordic APIs past events but did not sponsor this post]