A Way to Use Notion AI as an API

AI development has moved incredibly fast over the past year. For end users, however, two billing pain points persist: paying for each service separately, and paying by usage.

End-user applications typically do not charge by usage. Instead, services like Notion AI, Cubox, and Perplexity each require their own subscription, and the costs are not necessarily low.

Usage-based billing, on the other hand, is more common in the very active open-source community. You could replace proprietary tools entirely with open-source alternatives and wire the OpenAI or Claude APIs into your existing workflows. But that means paying by volume, which can become prohibitively expensive for an individual in heavy-use scenarios such as maintaining a knowledge base or filtering information streams.

My own challenge was that I wanted an AI-powered reading library without paying extra for Cubox. How should I approach this?

Last year, in my article “Building a Three-Level Information Reading Workflow with RSS and Notion AI,” I described building a reading library centered on Notion, with Notion AI as part of it. What I didn’t mention was that I had been using Cubox Pro as the front-end web clipper for Notion, because Notion’s own clipping plugin performs poorly: on most Chinese websites it often saves nothing but the article’s title.

Recently, I stumbled upon an open-source “read-it-later” project called Omnivore, which handles Chinese websites quite well. It parses content on the server side, so it isn’t constrained by the client’s browser: give it the link to a WeChat article, for example, and the server fetches the piece and simplifies its layout.

As an open-source project, Omnivore doesn’t provide AI services, but it does offer an API.
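Omnivore’s API is GraphQL-based. As a rough illustration of pulling saved items out of it, here is a minimal sketch; the endpoint and field names follow the public schema as I remember it, so verify them against the current Omnivore docs before relying on this:

```typescript
// Minimal sketch: list recently saved Omnivore items via its GraphQL API.
// OMNIVORE_TOKEN is your personal API key from the Omnivore settings page.
const OMNIVORE_API = "https://api-prod.omnivore.app/api/graphql";
const OMNIVORE_TOKEN = process.env.OMNIVORE_TOKEN!;

async function fetchRecentArticles(first = 10) {
  const res = await fetch(OMNIVORE_API, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: OMNIVORE_TOKEN,
    },
    body: JSON.stringify({
      query: `query Search($first: Int) {
        search(first: $first) {
          ... on SearchSuccess {
            edges { node { id title url savedAt } }
          }
          ... on SearchError { errorCodes }
        }
      }`,
      variables: { first },
    }),
  });
  return res.json();
}
```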

So I used n8n, an open-source automation tool similar to IFTTT, to patch together a workflow from Omnivore to Notion:

Here’s an example n8n workflow (you’ll need to modify it for personal use): Omnivore_To_Notion.json (available on Google Drive).

I encountered three main challenges during this process:

Firstly, Omnivore outputs Markdown by default, while the Notion API expects content as JSON block objects. To bridge the two, I used a third-party n8n node, n8n-nodes-notionmd, for the conversion.
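Outside of n8n, the same conversion can be done with a Markdown-to-Notion library. The sketch below uses @tryfabric/martian, which I assume does roughly what the notionmd node does internally (an assumption, not a statement about that node’s actual implementation):

```typescript
// Minimal sketch: convert Omnivore's Markdown output into Notion block
// objects that the API's `children` field accepts.
import { markdownToBlocks } from "@tryfabric/martian";

const markdown = "# Saved article\n\nFull text from Omnivore, in **Markdown**.";

const blocks = markdownToBlocks(markdown);
console.log(JSON.stringify(blocks, null, 2));
```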

Secondly, the official n8n Notion node cannot push raw Notion JSON directly, so I had to write the request myself with an HTTP Request node.
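What that HTTP Request node does boils down to a single call against Notion’s block children endpoint. A minimal sketch, assuming an integration token in NOTION_TOKEN and a placeholder page ID:

```typescript
// Minimal sketch: append prepared Notion JSON blocks to a page via the
// raw HTTP API, which is what the n8n HTTP Request node does here.
const NOTION_TOKEN = process.env.NOTION_TOKEN!;
const pageId = "your-notion-page-id"; // placeholder

async function appendChildren(blocks: object[]) {
  const res = await fetch(
    `https://api.notion.com/v1/blocks/${pageId}/children`,
    {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${NOTION_TOKEN}`,
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ children: blocks }),
    },
  );
  if (!res.ok) throw new Error(`Notion API error: ${res.status}`);
  return res.json();
}
```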

Lastly, the Notion API’s Append block children endpoint accepts at most 100 blocks per request, so the article has to be split into batches, which is what the loop at the end of the workflow does.
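In code terms, the batching amounts to something like this (reusing appendChildren from the previous sketch):

```typescript
// Minimal sketch: split converted blocks into batches of at most 100 and
// append them in order, mirroring the loop at the end of the n8n workflow.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

async function pushAllBlocks(blocks: object[]) {
  for (const batch of chunk(blocks, 100)) {
    await appendChildren(batch); // sequential, so block order is preserved
  }
}
```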

The end result of this workflow: every time I save a new article in Omnivore, its full text is automatically pushed to a designated database in Notion, where, following my earlier setup, Notion AI generates a summary. This lets me quickly save full article texts into a Notion reading library from any device, more efficiently than either Save to Notion or the Notion Web Clipper.

However, while building this workflow, I realized that Notion itself can serve as a makeshift AI API. It works as follows (a code sketch follows the steps):

  1. Create a new Notion page or database.
  2. In the page properties, set up an AI Autofill field with a pre-designed prompt.
  3. When AI assistance is needed, use the API to push the context to this page.
  4. The update of the page content triggers the AI Autofill.
  5. After a short delay, read the page’s AI Autofill field back via the API.
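Here is a hypothetical end-to-end sketch of that loop. It assumes the target database has an AI Autofill property named “Answer” with a pre-written prompt, that the autofill result surfaces over the API as an ordinary rich_text property, and that the API write triggers the autofill as described above; the database ID and property names are placeholders, so verify all of this in your own workspace:

```typescript
// Hypothetical sketch of the "Notion AI as an API" loop: push context to
// a new page, wait for AI Autofill to run, then read the result back.
const NOTION_TOKEN = process.env.NOTION_TOKEN!;
const DATABASE_ID = "your-database-id"; // placeholder
const headers = {
  Authorization: `Bearer ${NOTION_TOKEN}`,
  "Notion-Version": "2022-06-28",
  "Content-Type": "application/json",
};

// Notion caps a text object at 2000 characters, so split the context
// into paragraph blocks of that size.
const toParagraphs = (text: string) =>
  (text.match(/[\s\S]{1,2000}/g) ?? []).map((part) => ({
    object: "block",
    type: "paragraph",
    paragraph: { rich_text: [{ text: { content: part } }] },
  }));

async function askNotionAI(context: string): Promise<string> {
  // Step 3: push the context into a new page in the database.
  const created = await fetch("https://api.notion.com/v1/pages", {
    method: "POST",
    headers,
    body: JSON.stringify({
      parent: { database_id: DATABASE_ID },
      properties: { Name: { title: [{ text: { content: "AI request" } }] } },
      children: toParagraphs(context).slice(0, 100), // 100-block limit
    }),
  }).then((r) => r.json());

  // Steps 4-5: the update triggers AI Autofill; poll until the "Answer"
  // property is populated, then return its plain text.
  for (let attempt = 0; attempt < 20; attempt++) {
    await new Promise((r) => setTimeout(r, 5000)); // autofill latency varies
    const page = await fetch(`https://api.notion.com/v1/pages/${created.id}`, {
      headers,
    }).then((r) => r.json());
    const answer = page.properties?.Answer?.rich_text ?? [];
    if (answer.length > 0) {
      return answer.map((t: any) => t.plain_text).join("");
    }
  }
  throw new Error("AI Autofill did not populate in time");
}
```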

For instance, you could build a Chrome extension as an alternative to Perplexity: with each click, it pulls the full text of Google’s first-page search results into a Notion page, then uses an AI Autofill field to synthesize an answer from everything on the page. (This is something I’m attempting to do.)

When you use Notion AI as an AI API this way, you may not even need to push the content into the page body (which requires segmenting it into blocks). You can write it directly into a property instead, because AI Autofill can read the other properties on the same page.
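A variant of the earlier sketch for this property-based approach; “Context” is again a placeholder property name, and the 2000-character cap on text objects is handled by splitting across rich_text items rather than blocks:

```typescript
// Variant sketch: write the context into a plain rich_text property
// instead of the page body, so no block segmentation is needed.
async function pushContextAsProperty(pageId: string, context: string) {
  await fetch(`https://api.notion.com/v1/pages/${pageId}`, {
    method: "PATCH",
    headers, // same auth headers as the earlier sketches
    body: JSON.stringify({
      properties: {
        Context: {
          rich_text: (context.match(/[\s\S]{1,2000}/g) ?? []).map((part) => ({
            text: { content: part },
          })),
        },
      },
    }),
  });
}
```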

The only problem with this method is the delay before AI Autofill triggers, which can sometimes be significant. That makes it a poor fit for scenarios that require immediate feedback.

Still, at this stage, I believe this approach can outperform many AI apps whose innovation stops at the application layer.

It’s not advisable to exploit Notion AI for large-scale API usage, as behavior significantly beyond personal use is likely to result in account suspension.


Featured Comments

  1. Ryan An:

    This is a very impressive and helpful post. I’m trying to download an n8n JSON file, but I need to sign up for Alibaba, and it’s difficult for foreigners without a mobile phone number. Could you possibly share the n8n JSON file through GitHub or another method?

    1. 评论尸 (author):

      Done, I updated the download link in this post to Google Drive.

      1. Ryan An:

        I kept checking back in case you answered, and I really appreciate you sharing! This will be a great help in studying n8n!