LinkedIn's New Posts API: The Good, The Bad, and The Ugly

Jan Vlnas

Posted on April 11, 2023

Last summer, LinkedIn announced new API versioning and plans to migrate existing API endpoints to the new versioning scheme, along with some other improvements. The first set of endpoints to be migrated was the Posts API, responsible for working with posts from users' and business profiles.

There was also a deadline: by February 2023, existing API users had to migrate to the new Posts API, after which the old endpoints would stop functioning. That gave integration partners around eight months for migration, but apparently that was not sufficient: on the last day of February, LinkedIn announced a deadline extension to June 30.

Since I've migrated the LinkedIn integration for Superface, I wrote down a few notes about the overall experience and my frustrations with LinkedIn's new API and this particular deprecation.

The Good Parts

The API is better

I used to show the LinkedIn API as an example of poor API design. There were three endpoints for handling users' and organizations' posts (UGC Posts, Shares, and the Posts API, which was in beta for a long time), with seemingly overlapping features.
With weird, ad-hoc syntax for field projections and annoyingly long property names, it wasn't the most pleasant API to work with.

I'll have to find another poorly designed API now because the new Posts API is definitely an overall improvement. Deeply nested structures with confusing properties and arbitrary nesting of arrays – it's all gone.

To illustrate the difference, here is the same post as represented by the legacy ugcPosts API and the new versioned Posts API:

Post from ugcPosts API (legacy)

{
  "lifecycleState": "PUBLISHED",
  "specificContent": {
    "com.linkedin.ugc.ShareContent": {
      "shareCommentary": {
        "inferredLocale": "en_US",
        "attributes": [],
        "text": "Don't forget the image."
      },
      "media": [
        {
          "description": {
            "attributes": [],
            "text": "Image"
          },
          "media": "urn:li:digitalmediaAsset:D4E10AQE71V5w_-aalA",
          "thumbnails": [],
          "overlayMetadata": {
            "tapTargets": [],
            "stickers": [],
            "overlayTexts": []
          },
          "status": "READY"
        }
      ],
      "shareFeatures": {
        "hashtags": []
      },
      "shareMediaCategory": "IMAGE"
    }
  },
  "visibility": {
    "com.linkedin.ugc.MemberNetworkVisibility": "PUBLIC"
  },
  "created": {
    "actor": "urn:li:person:bDTsVFtMTq",
    "time": 1679663763610
  },
  "author": "urn:li:organization:2414183",
  "clientApplication": "urn:li:developerApplication:208506072",
  "versionTag": "0",
  "id": "urn:li:share:7045020441609936898",
  "firstPublishedAt": 1679663764088,
  "lastModified": {
    "actor": "urn:li:csUser:7",
    "time": 1679663764133
  },
  "distribution": {
    "externalDistributionChannels": [],
    "distributedViaFollowFeed": true,
    "feedDistribution": "MAIN_FEED"
  },
  "contentCertificationRecord": "{\"originCountryCode\":\"nl\",\"modifiedAt\":1679663763588,\"spamRestriction\":{\"classifications\":[],\"contentQualityClassifications\":[],\"systemName\":\"MACHINE_SYNC\",\"lowQuality\":false,\"contentClassificationTrackingId\":\"F00EA9A4CF8AA4C780241D4CE87D5E87\",\"contentRelevanceClassifications\":[],\"spam\":false},\"contentHash\":{\"extractedContentMd5Hash\":\"7871A7EF3ADBC18955073E68D2203F27\",\"lastModifiedAt\":1679663763587}}"
}

Post from the Posts API (new, versioned)

{
  "isReshareDisabledByAuthor": false,
  "createdAt": 1679663763610,
  "lifecycleState": "PUBLISHED",
  "lastModifiedAt": 1679663764133,
  "visibility": "PUBLIC",
  "publishedAt": 1679663764088,
  "author": "urn:li:organization:2414183",
  "id": "urn:li:share:7045020441609936898",
  "distribution": {
    "feedDistribution": "MAIN_FEED",
    "thirdPartyDistributionChannels": []
  },
  "content": {
    "media": {
      "altText": "Image",
      "id": "urn:li:image:D4E10AQE71V5w_-aalA"
    }
  },
  "commentary": "Don't forget the image.",
  "lifecycleStateInfo": {
    "isEditedByAuthor": false
  }
}

Clear versioning and deprecation policy

The legacy APIs were all “version 2”, to distinguish them from even older “version 1” endpoints. However, there was no further granularity within this version. It seems to me that new features were introduced on new endpoints, which is probably why they ended up with three different endpoints for posts.

For the versioned API “reboot”, LinkedIn chose a calendar-based versioning scheme. Each version is identified by year and month (e.g., 202303), and there are no guarantees about breaking changes between versions (as opposed to semantic versioning). Per the documentation, each API version is supported for one year, and specifying the version is mandatory for each call. Therefore, one can quickly see how far their integration code lags behind the current version.
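To make this concrete, here's a minimal TypeScript sketch of a versioned API call, with the version pinned via the LinkedIn-Version header. The q=author finder, the Rest.li protocol header, and the token handling reflect my reading of the docs at the time of writing; treat them as assumptions.

// Fetch an organization's posts from the versioned Posts API.
// Assumes Node 18+ (global fetch); ACCESS_TOKEN is a placeholder.
const ACCESS_TOKEN = process.env.LINKEDIN_ACCESS_TOKEN;
const author = encodeURIComponent("urn:li:organization:2414183");

const response = await fetch(
  `https://api.linkedin.com/rest/posts?author=${author}&q=author`,
  {
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      "LinkedIn-Version": "202303", // mandatory, pins the monthly version
      "X-Restli-Protocol-Version": "2.0.0",
    },
  }
);
const { elements } = await response.json();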

I think calendar-based versioning is the right call for quickly evolving APIs, and it seems to be an increasingly popular choice; GitHub announced a similar versioning scheme in November. Combined with a stable support window (something Facebook does with the Graph API), it provides predictability and a stable pace for API consumers.

However, it's not clear to me how exactly the deprecation will be handled, which I address below.

Communication to integration partners

Maybe my expectations of the LinkedIn API were too low, but I was pleasantly surprised by how well the communication about the deprecation was handled. There was a relatively long transition period of eight months (now extended), and all subsequent email communication and documentation pages contained a big reminder about the deprecation.

Still, it's pretty common for emails to go amiss and for no one to check the API documentation. (The integration is working now, so why would I need to check the docs?) But LinkedIn did use the communication channels they have with partners to get the message across.

Responsive support

Even more important was responsive support. While LinkedIn previously recommended asking questions using the linkedin-api tag on Stack Overflow (where they usually went unanswered), they now provide a support portal. I submitted a ticket and, to my surprise, received a helpful answer from a support representative in less than 24 hours. While I'd prefer a public forum where I could search for existing solutions first, having a working support channel is an improvement in itself.

The Bad Parts

While LinkedIn got many things right, there are a few things that bug me.

The clean shut-off

At this point, it's clear that the eight-month transition period was either too optimistic or a planned “soft deadline” from the start. API deprecation is a constant pain, since you need to wait for your integration partners to make the changes. If your partners are paying for your product, you want to avoid pulling the rug out from under them; and if your partners are big enterprises, you can't expect them to react quickly to your changes. Still, I think LinkedIn could have done a few things to hasten the migration.

It's also not clear what will actually happen when LinkedIn shuts off the deprecated endpoints. Will they return an HTTP 410 Gone error with a JSON message explaining the situation? Will they return a proxy error as an HTML page? Or will they redirect to a Rick Roll? (Probably not the last option.)

One way to test migration readiness is to schedule planned “brownouts”. GitHub, for example, uses this strategy for API deprecations (see the authentication changes notice for an example): it schedules multiple 12-to-48-hour outages in the months before the deprecation to simulate the final removal of the API. This is a great way to check whether the migration is complete, as there's typically one more call that no one has migrated yet. A brownout gate can be as simple as the sketch below.
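For illustration, here's what such a gate might look like on the provider's side; the windows and the response body are made up:

// Sketch: during scheduled brownout windows, a deprecated endpoint
// returns 410 Gone instead of serving the request.
const brownouts: Array<[start: string, end: string]> = [
  ["2023-05-10T00:00:00Z", "2023-05-10T12:00:00Z"], // 12-hour rehearsal
  ["2023-06-01T00:00:00Z", "2023-06-03T00:00:00Z"], // 48-hour rehearsal
];

function brownoutResponse(now: Date = new Date()): Response | null {
  const active = brownouts.some(
    ([start, end]) => now >= new Date(start) && now < new Date(end)
  );
  if (!active) return null; // outside a window, serve the request as usual
  return new Response(
    JSON.stringify({ message: "Deprecated endpoint, migrate to /rest/posts." }),
    { status: 410, headers: { "Content-Type": "application/json" } }
  );
}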

I'm uncertain whether this is an acceptable strategy for LinkedIn, from both a technical and a business standpoint, but it seems more sensible to me than just pulling the plug on the final day.

Removed features in favor of simplicity

I mentioned that the new API removed the field projections with their weird, poorly documented syntax. The downside is that there's no equivalent feature to replace them.

My typical use case for projections was to grab images and videos in posts with a single API request. To achieve the same functionality now, I need to collect media IDs from posts and resolve them with separate API calls, as sketched below. I believe this leads to a much simpler implementation on LinkedIn's side, and it can encourage clients to cache referenced media, but it still shifts some complexity onto the client's side.
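The two-step flow might look roughly like this; the ids=List(...) batch syntax follows Rest.li conventions, and the exact encoding and response shape are my assumptions:

// Collect image URNs from posts, then resolve them in a single batch call.
type Post = { content?: { media?: { id: string } } };

async function resolveImages(posts: Post[], token: string) {
  const imageUrns = posts
    .map((post) => post.content?.media?.id)
    .filter((id): id is string => !!id && id.startsWith("urn:li:image:"));

  const ids = imageUrns.map(encodeURIComponent).join(",");
  const response = await fetch(
    `https://api.linkedin.com/rest/images?ids=List(${ids})`,
    {
      headers: {
        Authorization: `Bearer ${token}`,
        "LinkedIn-Version": "202303",
        "X-Restli-Protocol-Version": "2.0.0",
      },
    }
  );
  return response.json(); // batch results keyed by URN, per Rest.li conventions
}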

Not-so-opaque object IDs

And speaking of media resolution, here's another catch.

Take a look at these objects from the Posts API response:

[
  {
    "id": "urn:li:ugcPost:7044823133844885504",
    "commentary": "Post A",
    "content": {
      "media": {
        "title": "Some title",
        "id": "urn:li:video:C5605AQHzRSAmLcHkTA"
      }
    }
  },
  {
    "id": "urn:li:share:7046413622230614016",
    "commentary": "Post B",
    "content": {
      "media": {
        "id": "urn:li:image:D5622AQHy2GLswHBoSg"
      }
    }
  }
]

Now, can you tell which post contains a video, and which contains an image?

Obviously, you can tell by the id property (urn:li:video vs. urn:li:image), but you shouldn't have to. IDs should be opaque values.

LinkedIn has separate endpoints for resolving images and videos, so you need to string-match the ID to figure out which endpoint to call, as in the sketch below.
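In code, that boils down to a helper like this; the endpoint paths are my assumption based on the separate Images and Videos APIs:

// Route a media URN to the endpoint that can resolve it by string-matching
// the URN prefix, which is exactly what opaque IDs should spare us from.
function mediaEndpoint(mediaUrn: string): string {
  const encoded = encodeURIComponent(mediaUrn);
  if (mediaUrn.startsWith("urn:li:image:")) {
    return `https://api.linkedin.com/rest/images/${encoded}`;
  }
  if (mediaUrn.startsWith("urn:li:video:")) {
    return `https://api.linkedin.com/rest/videos/${encoded}`;
  }
  throw new Error(`Unrecognized media URN: ${mediaUrn}`);
}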

Here are a few approaches that could improve this:

  • Provide a new endpoint for resolving any media, where I can pass both video and image IDs (e.g., /rest/media?ids=List(urn:li:video:...,urn:li:image:...)).
  • Add a new property identifying the type of media:
  {
    "id": "urn:li:share:7046413622230614016",
    "commentary": "Post B",
    "content": {
      "media": {
        "type": "image",
        "id": "urn:li:image:D5622AQHy2GLswHBoSg"
      }
    }
  }
  • Or provide me with a link to the resource (although it doesn't match the style of this API):
  {
    "id": "urn:li:share:7046413622230614016",
    "commentary": "Post B",
    "content": {
      "media": {
        "self": "https://api.linkedin.com/rest/images/urn:li:image:D5622AQHy2GLswHBoSg",
        "id": "urn:li:image:D5622AQHy2GLswHBoSg"
      }
    }
  }

The Ugly Parts

Some API changes are painful and ugly, but they have their reasons and maybe they'll be resolved in time.

Scrape it yourself, will you?

Most social media platforms, like Facebook or Twitter, automatically generate a “preview card” from a link contained in a post. There are some slight differences when publishing via the API: for example, Facebook accepts a custom title, description, and thumbnail for the preview card, as long as the link points to a domain with verified ownership. Twitter, on the other hand, doesn't allow any preview customization during publishing.

LinkedIn used to generate a link preview automatically when a post was published through the legacy ugcPosts API. In the Posts API, this functionality has been removed:

Posts API does not support URL scraping for article post creation as it introduces level of unpredictability in how a post is going to look when API partners create it. Instead, API partners need to set article fields such as thumbnail, title and description within the post when creating an article post.

Fundamentally, I agree with this approach. I've experienced first-hand customer complaints about articles with missing or incorrect thumbnails. Usually these were caused by a disparity between the preview generated by a third-party application and LinkedIn's preview scraper. Furthermore, LinkedIn's scraped previews were impossible to refresh, so sometimes I had to instruct customers to add a dummy query string to the links they shared just to get a correct preview.1

So putting the responsibility for generating a link preview fully on the API clients' side makes sense. Still, it adds extra complexity to the publishing process.
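To sketch what that extra work looks like: instead of just passing a URL, the client now supplies all the preview fields itself. The payload below follows the article content shape from the Posts API docs as I understand it; the thumbnail must be an image URN uploaded beforehand, and the example values are made up.

// Create an article post with explicit preview fields; nothing is scraped.
const articlePost = {
  author: "urn:li:organization:2414183",
  commentary: "Check out our latest blog post!",
  visibility: "PUBLIC",
  lifecycleState: "PUBLISHED",
  distribution: {
    feedDistribution: "MAIN_FEED",
    targetEntities: [],
    thirdPartyDistributionChannels: [],
  },
  content: {
    article: {
      source: "https://example.com/blog/post",
      title: "A title LinkedIn used to scrape for you",
      description: "A description, likewise supplied by the client now.",
      thumbnail: "urn:li:image:D4E10AQE71V5w_-aalA", // pre-uploaded via the Images API
    },
  },
};

await fetch("https://api.linkedin.com/rest/posts", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.LINKEDIN_ACCESS_TOKEN}`,
    "LinkedIn-Version": "202303",
    "X-Restli-Protocol-Version": "2.0.0",
    "Content-Type": "application/json",
  },
  body: JSON.stringify(articlePost),
});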

Sure, you could just willy-nilly scrape whatever pages you want to publish. But every so often, that won't work. I've dealt with websites whose owners were paranoid about any scraping and blocked requests coming from unknown bots. In the end, they allowlisted our link preview scraper, but it took some negotiation.

LinkedIn could simplify this process, and help both developers and paranoid site owners, by providing a separate API endpoint for scraping URL previews, similar to the one Facebook offers.

The migration guide is somewhat useless

LinkedIn provides convenient migration guides for individual APIs.
Unfortunately, the Content APIs migration guide left me struggling with the new API. Sure, it describes how individual fields in schemas were renamed, simplified, or removed (although a direct JSON-to-JSON comparison would probably be more descriptive), and it briefly describes how the workflow changed. But it's still too brief.

If you need me to change the workflow, show me, step by step, how it differs from the old one. Even better, show me some code. The guide also doesn't mention anything about the removal of field projections, but maybe usage of this feature was just too marginal.

Not-so-clear deprecation strategy

The versioning guide mentions that LinkedIn expects their partners to “keep up with them”:

LinkedIn expects that our LinkedIn Marketing API Program API partners work to deliver the latest and most valuable experiences to our customers within a reasonable time of their availability. As a result, we will sunset our API versions as early as one (1) year after release.2

Since no versioned API has reached its end of life yet, I have yet to see what “sunsetting an API version” means in practice. I think it could be one of these options:

  1. The requests immediately start returning an error on the first day of the 13th month. (But what error? What status code?)
  2. The requests will probably keep working for some time, but can break at any moment: calling an outdated API is at your own risk.
  3. We will automatically redirect your outdated calls to a newer API version, and it will work as long as we don't introduce any breaking changes to the request schema.

The third option is roughly what Facebook does: I occasionally run into code using API versions more than five years old,3 and it still works. I suspect LinkedIn will go with the first or second option. If that's the case, I hope they'll give developers an early warning that their integrations are about to break. In other words:

  • Provide developers with API version usage in their app analytics.
  • Describe in gory detail what exactly happens after an API version reaches its end of life.

Conclusion

So, that's probably far too many words about the new LinkedIn API. Despite my criticism, I think it's still an overall improvement, and I'm glad LinkedIn takes the developer experience seriously, unlike other social media (ahem).


  1. Unlike Facebook, which provides a convenient tool for debugging and refreshing link previews. 

  2. Emphasis mine. 

  3. In passport-facebook, for example. 
