These are the news items I've curated in my monitoring of the API space that have some relevance to the API definition conversation, and that I wanted to include in my research. I'm using all of these links to better understand how the space is testing their APIs, going beyond just monitoring to understand the details of each request and response.

16 Oct 2017
APIs are not forever, and eventually will go away. The trick with API deprecation is to communicate clearly, and regularly with API consumers, making sure they are prepared for the future. I’ve been tracking on the healthy, and not so healthy practices when it comes to API deprecation for some time now, but felt like Google had some more examples I wanted to add to our toolbox. Their approach to setting expectations around API deprecation is worthy of emulating, and making common practice across industries.
The Google Adwords API team is changing their release schedule, which in turn impacts the number of APIs they’ll support, and how quickly they will be deprecating their APIs. They will be releasing new versions of the API three times a year, in February, June and September. They will also be supporting only two releases concurrently at all times, and three releases for a brief period of four weeks, pushing the pace of API deprecation alongside each release. I think that Google’s approach provides a nice blueprint that other API providers might consider adopting.
Adopting an API release and sunset schedule helps communicate the changes on the horizon, but it also provides a regular rhythm that API consumers can learn to depend on. You just know that there will be three releases a year, and you have a quantified amount of time to invest in evolving your integration before any API is deprecated. It’s not just the communication around the roadmap, it is about establishing an API release and sunset cadence that API consumers can be in sync with. Something that can go a lot further than just publishing a roadmap, and tweeting things out.
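To make this cadence concrete, here is a rough sketch in Python of how such a release and sunset calendar could be computed. This is my own illustration, not Google's published policy; the first-of-the-month release dates and the function names are all assumptions:

```python
from datetime import date, timedelta

# A rough sketch (mine, not Google's actual policy) of an AdWords-style
# cadence: three releases a year, two versions supported concurrently,
# and a brief four-week overlap before the oldest version is deprecated.
RELEASE_MONTHS = [2, 6, 9]  # February, June, September

def releases(start_year, years):
    """Yield (version, release_date) pairs across the given years."""
    version = 1
    for year in range(start_year, start_year + years):
        for month in RELEASE_MONTHS:
            yield f"v{version}", date(year, month, 1)
            version += 1

def sunset_schedule(start_year, years, overlap_weeks=4):
    """Map each version to its sunset date: a version is deprecated
    four weeks after the release that makes it the third-oldest."""
    rels = list(releases(start_year, years))
    schedule = {}
    for i, (version, _released) in enumerate(rels):
        if i + 2 < len(rels):  # two newer releases must exist first
            schedule[version] = rels[i + 2][1] + timedelta(weeks=overlap_weeks)
    return schedule

# v1 ships Feb 2018, and sunsets four weeks after v3 ships in Sep 2018.
print(sunset_schedule(2018, 2)["v1"])  # 2018-09-29
```

The value of publishing something like this, rather than just a roadmap, is that every consumer can compute exactly how long any given version will live the day it ships.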
I’ll add this example to my API deprecation research. Unfortunately the topic is one that is not widely communicated around in the API space, but Google has long been a strong player when it comes to finding healthy API deprecation examples to follow. I’m hoping to get to the point soon where I can publish a simple guide to API deprecation. Something API providers can follow when they are defining and deploying their APIs, and establish a regular API release and deprecation approach that API developers can depend on. It can be easy to get excited about launching a new API, and forget all about its release and deprecation cycles, so a little guidance goes a long way to helping API providers think about the bigger picture.
I spend my days mapping out the API life cycle, keeping track of what I consider to be the 50+ areas of a modern API life cycle, based upon the approach I am seeing from leading providers. One area of this life cycle where I'm spending way more time than I'd like lately is API deprecation.
This is why I conduct my API research in the way that I do, because when people approach me for advice, guidance, or a brain dump in any of these areas, I have a wealth of resources I can pull from. Recently I have had a couple of folks ask me for input on what I'd consider to be a good approach to individual API deprecation, as well as entire company, platform, and API deprecation.
Sadly, there are very few "good approaches" to shutting down your APIs in the wild. Luckily there are quite a number of very bad approaches available that we can reverse engineer, and begin putting together a scaffolding for what might be some possible best practices when it comes to API deprecation.
I went through what I had curated as part of my API deprecation research, and assembled these common building blocks:
- Individual API Endpoints - Looking to have a plan for the deprecation of individual APIs, or sets of APIs.
- Individual API Tooling - An approach to how you deprecate one of the client solutions available on an API platform.
- Entire Platform Shutdown - What is the plan for an entire platform shutdown, ceasing all API operations.
- Enter Platform Safe Mode - What is the plan for just putting the platform into safe mode, and still operating.
- Initial Expectations Set - What expectations are, or have been, set around API deprecation early on in the design and deployment parts of the life cycle.
- First Decision - Establishing that an API or platform will need to be deprecated. Recording the date, time, and other key details about when this decision was reached.
- First Notice - What is the time to the first external notice, either privately to partners, or also publicly to the general community.
- Lock-down New Signups - When will new signups for the platform be locked down, stopping all new user and application registration.
- Lock-down New Writes - When will all POST, PUT, and other write capabilities be locked down, making it easier to stabilize, sync, and migrate for final shut down.
- Light Switch Flicking - Will there be light switch flicking, or blackout testing, which sends real-world pain to API consumers, reminding them that a shutdown is imminent?
- Other Milestones - What other milestones are there that will be important to internal, partner, and public stakeholders?
- Final Date - What is the final date for shutdown, requiring all syncs, migrations, exports, and integration to completely cease.
- Key Players - As soon as the first decision to deprecate is made, who are the key players and stakeholders that need to be contacted, and made aware of the API deprecation?
- Public Players - Once key stakeholders are made aware, what is the plan to go public, and make the general community, and industry aware that the API deprecation is imminent.
- Schedule (Runway) - What does the entire runway look like, with a complete schedule from first decision to final date -- including a communication schedule documented along the way?
- Historical - What historical communications will be made available from blog posts, to tweets, documentation, and other communication focused efforts.
- Be Real - Make sure that you are genuine in your plans, communication, and outreach.
- Be Friendly - Be as friendly as you can. It will be a hard time for everyone involved.
- Be Transparent - Try to be as transparent as you possibly can, throughout the process.
- Be Respectful - Showing respect for your consumers, and their challenges will go a long way.
- PR Campaign - Craft a public relations plan, and execute it well. Don't take shortcuts.
- Communicate Often - Make sure to email, blog, tweet, and communicate regularly about the deprecation.
- Data Migration - What data migration tools, services, and support will be made available to help customers get their data off of a platform, and somewhere they can put it to use?
- Settings Migration - Will there be the opportunity to export settings, and other configurations, so that they can be applied into other external systems, either automatically, or manually.
- Data Sync - Are there any real-time data sync opportunities, to help reduce the amount of work that is needed at the time of shutdown, slowly getting API consumers offloaded to their new solution.
- Data Portability - Beyond migration and syncing, can API consumers get raw dumps of all of their usage data, as well as any other objects and entities they have stored via API operations?
- Data License - How will data be licensed after things are shut down?
- Server Code - What server code will be made available under an open source license?
- Client Code - What client code will be made available under an open source license?
- API License - How will the API definition, and data schema be licensed for reuse?
- AWS AMI - Will there be an Amazon AMI made available for any server or client implementations?
- Heroku Deploy - Will there be Heroku deployment made available for any server or client implementations?
- Docker Image - Will there be Docker images made available for any server or client implementations?
- Github Repo - Will a dedicated Github repository be made available for the company, platform, or API, capturing any remaining code, data, and content?
- Github Issues - Will a Github issues queue be established for handling all concerns, and conversations around the API deprecation?
- API Purgatory - Have you contacted Kin Lane, the API Evangelist, so he can take a snapshot of your API, for inclusion in the API Purgatory Museum.
- X-API-Warn - Returning the X-API-Warn header on responses to all requests made to any deprecated APIs.
- Contact Person - Who is the person (name, email, Twitter, and Github) acting as the point of contact for the API deprecation process?
- Migration Partners - Are there any migration partners that can help out during any migrations, syncing, configuration, and setup of external solutions.
- Migration Locations - Are there any other existing platforms, including competitors, that you can point API consumers to, as an alternative to the API being deprecated.
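As a small illustration of the X-API-Warn building block above, here is a sketch of how a provider might stamp warning headers onto responses from deprecated endpoints. The WSGI middleware, the path, and the sunset date are all hypothetical, and I've paired the custom X-API-Warn header with the Sunset header standardized in RFC 8594:

```python
# Hedged sketch, not a prescribed implementation: a tiny WSGI middleware
# that adds deprecation warning headers to responses from endpoints
# listed for shutdown. The path and date below are hypothetical.

DEPRECATED = {
    "/v1/photos": "Sat, 01 Jun 2019 00:00:00 GMT",  # hypothetical sunset date
}

def deprecation_middleware(app):
    """Wrap a WSGI app so deprecated paths carry warning headers."""
    def wrapped(environ, start_response):
        path = environ.get("PATH_INFO", "")

        def start_with_warnings(status, headers, exc_info=None):
            if path in DEPRECATED:
                headers = headers + [
                    ("X-API-Warn", "deprecated; see the migration guide"),
                    ("Sunset", DEPRECATED[path]),  # RFC 8594 Sunset header
                ]
            return start_response(status, headers, exc_info)

        return app(environ, start_with_warnings)
    return wrapped
```

The appeal of doing this at the middleware layer is that the deprecation schedule lives in one place, instead of being scattered across every endpoint handler.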
Like the areas of my API life cycle, these API deprecation building blocks are by no means complete. I'm just extracting some of the common elements that I see when I look through just the little bit of API deprecation history that I've managed to curate. The trick now is to find more time so that I can dig deeper, find the API deprecation stories I do not have indexed, and hopefully establish a timeline of sorts, along with some of the other possible building blocks that might help folks craft their own deprecation strategy.
I'd love to hear your thoughts on what you think should be considered as part of the inevitable API deprecation process. I thought my friend Mark Boyd (@mgboydcom) did a good job capturing the reality of API deprecation from a provider view, as well as what things look like from the API consumer vantage point, if you need a good narrative of what I'm trying to achieve.
Ultimately I think this is a conversation we need to be having more often, with more API providers, out in the open. I'd like to see more platforms publish what their API deprecation strategy is, either as a blog post, or in a more formal way. I'll spend more time next week digging deeper into the past to find some of the more high profile API deprecations that I'm missing in my research, gather feedback from folks I'm talking to about their API deprecation strategies, and hopefully evolve this list of common API deprecation building blocks forward.
I'm in the middle of a sprint, where I am going through 50 of my main API stacks, to see what has changed, and who is still home. I'm always fascinated by the number of APIs that just fade away into a 301 redirect to a domain's home page. Some projects get gobbled up by domain squatters, while others almost rise to the level of API deprecation art.
I might get this one framed. I thought the background, combined with the message, was a great representation of the current state of API affairs. It's getting harder and harder to operate an API, keep it up and running, and live up to the hype and expectations.
I am going to start saving more snapshots of what happens when an API goes away--who knows maybe some day I'll have an interesting collection.
It always makes me smile, when I talk to someone about one or many areas of my API research, sharing how I conduct my work, and they are surprised to find how many areas I track on. My home page has always been a doorway to my research, and I try to keep this front door as open as possible, providing easy access to my more mature areas like API management, all the way to my newer areas like how bots are using APIs.
From time to time, I like to publish my API life cycle research to an individual blog post, which I guess puts my home page, the doorway to my research, into my readers' Twitter streams and feed readers. Here is a list of my current research for April 2016, from design to deprecation.
I am constantly working to improve my research, organizing more of the organizations who are doing interesting things, the tooling that is being deployed, and relevant news from across the area. I use all this research, to fuel my analysis, and drive my lists of common building blocks, which I include in my guides, blueprints, and other white papers and tutorials that I produce.
I am currently reworking all the PDF guides for each research area, updating the content, as well as the layout to use my newer minimalist guide format. As each one comes off the assembly line, I will add to its respective research area, and publish an icon + link on the home page of API Evangelist--so check back regularly. If there is any specific area you'd like to see get more attention, always let me know, and I'll see what I can do.
I had one of my followers ask me if there was a “gold standard for API deprecation policies” out there. I’d say that deprecation policies are a little different from some of the other legal building blocks of an API, in that the legal side is a small portion of it, with the majority really being all about communication.
To answer the question, I recommended Google. Not because their deprecation policy itself is all that great, it is just because their overall approach to API deprecation is pretty robust—something I’m sure is derived from their experience so far. (wink wink)
API deprecation at Google starts with some pretty clearly stated policies, but goes much further with:
These are just a handful of the building blocks that Google employs as part of their approach to API deprecation. This post is just the result of 15 minutes of Googling around to respond to the question. Now that I have written this post, and put the topic of API deprecation back on my radar, the next step is to dive deeper and develop a complete picture of how Google approaches API deprecation, as well as other leading players in the space.
In the past, keeping our deprecation strategies in sync with our overall product sales and marketing took a full-time staff, but in 2019 we now have an automated, API-driven, software defined deprecation strategy that is in perfect sync, in real time, with our sales and marketing. In the past, we could only flick a switch when we started marketing a new device, which would inject latency into all of our customers' devices--this is a brand new age of connected device sales.
With our new system, we can personalize hardware deprecation scenarios down to the user, and the specific feature. The latest device release is a good example. The 7Z line of our devices allows for faster uploading, and much larger file sizes, so for all of the 6Z customers we can actually slow their image uploads, increase error rates, and trigger many more perceived deprecation scenarios, in real time. The real-time portion actually syncs these personalized events with the advertising events a user is targeted with. We can tell if a user has seen the new 7Z line commercial on TV or the web, and within 30 minutes we can execute a deprecation scenario--bringing the pain home.
The operating system for all our devices has this patent-pending Software Defined Deprecation (SDD) layer, which uses its APIs to connect with the real-time advertising and marketing layer. For most of a device's life, SDD will not be leveraged, but as soon as sales or marketing efforts kick off, SDD can come into play at the individual, regional, or specific demographic level. We are still crunching the latest sales figures, and the role SDD has played in generating sales for the 7Z line, but we are confident that in Q4 we should be able to achieve 40% conversion rates with customers, purely based upon perceived SDD events.
SDD is no longer a static feature you turn on and off in devices, it is now deeply woven into how we market and sell our Internet connected devices, making the earlier concept of planned obsolescence a real-time experience that we can personalize, target, and implement to optimize our sales. With over 100 devices left in our product catalog where we can apply the next gen SDD technology, the opportunities for new revenue are unlimited in the coming years.
If you think there is a link I should have listed here feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one person show, and I miss quite a bit, so I depend on my network to help me know what is going on.