David Yeiser

How to Create a Blog with the Airtable API & Next.js

This is an advanced tutorial on how to pull data from Airtable via their API onto your site with Next.js and Node.js. I say “advanced” because it assumes you know how to install and set up Node.js and Next.js and that you’re already familiar with JavaScript development. The best tutorials never assume reader knowledge, but this one does, so I wanted to make that clear at the beginning. If this is your first time with React or Node, or working with APIs, this tutorial will likely frustrate you.

tldr

I’ve put all of the code referenced here in a GitHub repository if you just want to skip straight to that.

Update 1/15/20

I have decided to retire this tutorial as I no longer use Airtable for a blog. I don’t think it’s a bad idea or anything like that, I have just switched to a different setup and therefore no longer have a reason to keep up with the latest packages and so forth.

I updated the packages on the GitHub repository (and in this tutorial) to be more current and tested it quickly to make sure that it still works — so it’s still a valid reference for now, but just wanted to note that I’ll no longer be making any updates to this tutorial or the GitHub repository.

Overview

This tutorial has nine sections that cover how to set up Airtable as a blogging system with a Next.js-powered site:

1 Server Setup

The first thing to set up is an Express server. If you’re using only Next.js, the scripts portion of your package.json may look like this:

"scripts": {

"dev": "next",

"build": "next build",

"start": "next start"

}

What we need to do is have start and dev reference a server.js file rather than next as the starting point. Mine looks like this:

"scripts": {

"dev": "NODE_ENV=development node server.js",

"test": "echo \"Error: no test specified\" && exit 1",

"build": "next build",

"heroku-postbuild": "next build",

"start": "NODE_ENV=production node server.js"

}

The heroku-postbuild is for deploying to Heroku when running Next.js with a custom Express server (more information). Yours will vary depending on which platform you’re using for deployment and hosting.

You’ll also need a variety of packages; here are the ones from the airtable-nextjs-blog repository:

"dependencies": {

"airtable": "^0.5.8",

"babel-plugin-styled-components": "^1.9.2",

"dateformat": "^3.0.3",

"dotenv": "^6.1.0",

"express": "^4.16.4",

"isomorphic-unfetch": "^3.0.0",

"markdown-it": "^8.4.2",

"next": "^7.0.2",

"react": "^16.6.3",

"react-dom": "^16.6.3",

"react-markdown": "^4.0.4",

"redis": "^2.8.0",

"shortid": "^2.2.14",

"styled-components": "^4.1.2"

}

A quick note: if you want to use styled-components with Next, Server-side rendered styled-components with Nextjs is a tutorial that explains how to set it up. It’s already implemented for you in the example project, but that link is a good one if you want to know more about it.

Finally, set up server.js to call Next. Here’s a basic setup:

const express = require('express')
const next = require('next')

const dev = process.env.NODE_ENV !== 'production'
const port = process.env.PORT || 3000
const app = next({ dev })
const handle = app.getRequestHandler()

const serialize = data => JSON.stringify({ data })

app.prepare()
  .then(() => {
    const server = express()

    server.get('*', (req, res) => {
      return handle(req, res)
    })

    server.listen(port, (err) => {
      if (err) throw err
      console.log(`> Ready on http://localhost:${port}`)
    })
  })
  .catch((ex) => {
    console.error(ex.stack)
    process.exit(1)
  })

2 Environment variables

You don’t want to paste your API key directly into the code; instead, you’ll want to create custom environment variables. The way I would normally create these did not work with this setup. However, I came across a nifty npm package called dotenv that allows you to store environment variables in a .env file. Then with Next, we need to add some configuration via next.config.js to round it out.

First, install the dotenv package, and then paste this snippet into your next.config.js file (at the top level of your project):

const { parsed: localEnv } = require('dotenv').config()
const webpack = require('webpack')

module.exports = {
  webpack: (config, { buildId, dev, isServer, defaultLoaders }) => {
    config.plugins.push(
      new webpack.EnvironmentPlugin(localEnv)
    )
    return config
  }
}

And then you need a corresponding .env file (again at the top level) with the variables you will need. So for example, let’s say we want to call our environment variable AIRTABLE_API_KEY; our .env file will look like this:

AIRTABLE_API_KEY=keywkO49kelajkJpW

I just made up that key, but yours would look similar to this. I also added my Airtable Base ID as an environment variable. Now we can reference the API key with process.env.AIRTABLE_API_KEY. At the very top of server.js add this:

// Access .env variables
if (process.env.NODE_ENV !== 'production') {
  require('dotenv').config()
}

Lastly, make sure you add .env to your .gitignore file. The GitHub repository has an example setup for all of this.
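If you’re setting this up from scratch, the relevant pieces are small. The AIRTABLE_BASE_ID name below is just my assumption for the Base ID variable mentioned above, and the values are placeholders; use whatever names and values you chose:

# .gitignore
node_modules
.env

# .env
AIRTABLE_API_KEY=keywkO49kelajkJpW
AIRTABLE_BASE_ID=appXXXXXXXXXXXXXX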

3 API setup

With that note on environment variables out of the way, let’s set up a connection to the Airtable API. We’ll use their official JavaScript library to connect to the API. (Your Airtable API key can be found on your account page.) Run:

npm install airtable --save

After it’s installed, add this to your server.js file:

const Airtable = require('airtable')
Airtable.configure({ apiKey: process.env.AIRTABLE_API_KEY })

Then in the same server.js file add the API call (code below) as a function that can be referenced as needed.

/* Main Airtable Query */
const getAirtablePosts = (baseId) => {
  // Each Airtable "Base" has its own ID
  const base = new Airtable.base(baseId)

  return new Promise((resolve, reject) => {
    // Setup empty array to store results
    const storeAirtablePosts = []

    // Query
    const apiQuery = {
      pageSize: 50,
      sort: [{field: 'Publish Date', direction: 'desc'}]
    }

    // Go get it!
    base('YOUR_TABLE_NAME').select(apiQuery).eachPage((records, fetchNextPage) => {
      // This function (`page`) will get called for each page of records.
      // The properties here would correspond to your records
      records.forEach(function(record) {
        const post = {
          title: record.get('Title'),
          content: record.get('Content'),
          publish_date: record.get('Publish Date'),
          slug: record.get('Slug'),
          id: record.id
        }

        // Store each result in our empty array
        storeAirtablePosts.push(post)
      })

      fetchNextPage()
    }, function done(error) {
      // Throw error if exists
      if (error) reject({ error })

      // Finish
      resolve(storeAirtablePosts)
    })
  })
}

Now with this function we can just call getAirtablePosts('YOUR_BASE_ID') and we get back our Airtable data as an array of objects.
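As a quick sanity check (not part of the final setup), you could call it once in server.js after the environment variables load and log the result. This assumes you stored the Base ID in an AIRTABLE_BASE_ID environment variable as mentioned earlier:

// Temporary: confirm the Airtable connection works, then remove
getAirtablePosts(process.env.AIRTABLE_BASE_ID)
  .then(posts => console.log(`Fetched ${posts.length} posts from Airtable`))
  .catch(error => console.error(error))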

4 Internal API routes

Next, we will set up a custom internal URL that references our getAirtablePosts() function and returns the result as JSON. This will allow us to set up our Next pages to call this URL and then do something with the returned data. And once you have the internal URLs set up, you may find other uses for them; at the end, for example, we’ll look at how to set up a JSON feed.

You may have noticed the serialize function in the initial code for server.js in Section 1; here is where we’ll use that function. Add this code to your server.js file where routes are handled. (BASE_ID below should hold your Airtable Base ID, for example via the environment variable mentioned earlier.)

// Internal API call to get Airtable data
server.get('/api/get/posts', (req, res) => {
  Promise.resolve(getAirtablePosts(BASE_ID)).then(data => {
    res.writeHead(200, {'Content-Type': 'application/json'})
    return res.end(serialize(data))
  }).catch((error) => {
    console.log(error)

    // Send empty JSON otherwise page load hangs indefinitely
    res.writeHead(200, {'Content-Type': 'application/json'})
    return res.end(serialize({}))
  })
})

What this does is set up a URL of yourdomain.com/api/get/posts that returns the Airtable data as JSON when accessed. That’s it, pretty straightforward. With this, we have everything we need on the server end of our application and can now move to the front-end.
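For reference, given the serialize helper and the post object built above, the JSON returned by /api/get/posts will be shaped roughly like this (field values here are made up):

{
  "data": [
    {
      "title": "Hello world",
      "content": "Some **Markdown** content",
      "publish_date": "2018-11-30",
      "slug": "hello-world",
      "id": "recXXXXXXXXXXXXXX"
    }
  ]
}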

5 Retrieve & parse data with Next.js

I’m assuming that you have Next.js installed and are familiar with its setup. What we’ll do in this section is set up our “home” page, index.js, to retrieve and display our Airtable data. I’m also assuming you have a base Layout component that is used as the overall framework for the site (see example). We’ll create the Post component after we get index.js set up initially.

In pages/index.js add the following code:

import fetch from 'isomorphic-unfetch'
import Link from 'next/link'
import shortid from 'shortid'

import Layout from '../components/Layout'
import Post from '../components/Post'

class Home extends React.Component {
  constructor() {
    super()

    this.state = {
      airtablePosts: []
    }
  }

  componentDidMount() {
    const { props } = this

    const transferPosts = new Promise((resolve) => {
      const collectPosts = []

      Object.keys(props).map((item) => {
        // Filter out other props like 'url', etc.
        if (typeof props[item].id !== 'undefined') {
          collectPosts.push(props[item])
        }
      })

      resolve(collectPosts)
    })

    Promise.resolve(transferPosts).then(data => {
      this.setState({ airtablePosts: data })
    })
  }

  render() {
    const { airtablePosts } = this.state

    if (!Array.isArray(airtablePosts) || !airtablePosts.length) {
      // Still loading Airtable data
      return (
        <Layout>
          <p>Loading&hellip;</p>
        </Layout>
      )
    }
    else {
      // Loaded
      return (
        <Layout>
          {airtablePosts.map((post) =>
            <Post
              key={shortid.generate()}
              title={post.title}
              content={post.content}
              publish_date={post.publish_date}
              slug={post.slug}
              id={post.id}
            />
          )}
        </Layout>
      )
    }
  }
}

Home.getInitialProps = async (context) => {
  const basePath = (process.env.NODE_ENV === 'development') ? 'http://localhost:3000' : 'https://yourdomain.com'
  const res = await fetch(`${basePath}/api/get/posts`)
  const airtablePosts = await res.json()

  return airtablePosts ? airtablePosts.data : {}
}

export default Home

There’s a lot going on here! Let’s look at a few sections to make sense of it.

First, a key function is getInitialProps at the end. This is a special function Next.js provides on top-level pages that can be used to initialize the page with data already present in props.

What Home.getInitialProps does is tell Next.js to first call our internal API route and retrieve the Airtable data returned by the API so it can be used on the page. Note the placeholder for “yourdomain.com”: this will be necessary when you go to production. In your local development environment it will use localhost, but make sure to change the port number if you’re not on 3000.

Once that has happened, the array of objects is loaded in props and can be accessed as needed. In componentDidMount() we access props, filter out what we don’t need (props is returned with other items besides our Airtable data), and then update state with the Airtable data.

In render(), we wait for the data from state; once it has been updated we loop through the array with map and display each Airtable record as a post.

(The shortid.generate() is from the shortid package referenced at the top. It’s a handy little package for generating unique ids.)

Set up the Post component

The <Post> component is what displays the data; here’s a quick example of how it could be set up:

import Link from 'next/link'
import dateFormat from 'dateformat'
import Markdown from 'react-markdown'

class Post extends React.Component {
  render() {
    const {
      title,
      content,
      publish_date,
      slug,
      id
    } = this.props

    const permalink = !!id ? '/post/' + id + '/' + slug : false

    return (
      <div>
        {!!permalink ?
          <Link href={permalink}>
            <a title="Permalink for this note">
              {title && <h2>{title}</h2>}
              {publish_date &&
                <time dateTime={dateFormat(publish_date, 'isoDateTime')}>{dateFormat(publish_date, 'mmmm d, yyyy')}</time>
              }
            </a>
          </Link> :
          <div>
            {title && <h1>{title}</h1>}
            {publish_date &&
              <time dateTime={dateFormat(publish_date, 'isoDateTime')}>{dateFormat(publish_date, 'mmmm d, yyyy')}</time>
            }
          </div>
        }

        {content && <Markdown source={content} />}
      </div>
    )
  }
}

export default Post

The Markdown component is key for rendering Airtable content as HTML; otherwise it will just be plain text. For the permalink setup, you don’t have to add a slug; all you really need is the id. I set up mine this way to give the URL a bit more context, and I suppose it’s a little more friendly to the search engines, but the slug is superfluous for data retrieval.

The id is necessary because it is what will be used to retrieve a single record from the Airtable API when just that item is needed, which allows permalinks for each record, or post. The conditional is so that the permalink is not rendered on the permalink page. This will make a little more sense when we tie everything together at the end.
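For context, the piece that ties this together on the server is a route that maps the permalink URL to a Next page. A minimal sketch, assuming a pages/post.js page exists (the route and page names here are my placeholders; check the example repository for how it’s actually wired up):

// Render the single-post page for URLs like /post/recXXXX/my-slug
server.get('/post/:id/:slug', (req, res) => {
  const queryParams = { id: req.params.id, slug: req.params.slug }
  return app.render(req, res, '/post', queryParams)
})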

At this point, you should be able to load your localhost URL and see the Airtable data displayed as posts.

6 Set up API call for single record

In order to make our blog feature complete we need a way to retrieve a single record from Airtable based on a URL. In other words, we need a permalink and view for a single blog post.

First, let’s create another function to connect to Airtable and retrieve a single record. Airtable has a find method that takes the record ID as its single lookup parameter. We then structure the returned data as a single object that contains our post content. Here’s the function to add to server.js:

/* Get Individual Airtable Record */
const getAirtablePost = (recordId, baseId) => {
  const base = new Airtable.base(baseId)

  return new Promise((resolve, reject) => {
    base('YOUR_TABLE_NAME').find(recordId, function(err, record) {
      if (err) {
        console.error(err)
        reject({ err })
        // Bail out so we don't try to read from an undefined record
        return
      }

      const airtablePost = {
        title: record.get('Title'),
        content: record.get('Content'),
        publish_date: record.get('Publish Date')
      }

      resolve(airtablePost)
    })
  })
}

As we did for the main API call, let’s set up another internal URL that uses this function to retrieve our data from Airtable and return it as JSON. Here’s the code:

// Internal API call to get individual Airtable post
server.get('/api/post/:id', (req, res) => {
  Promise.resolve(getAirtablePost(req.params.id, BASE_ID)).then(data => {
    res.writeHead(200, {'Content-Type': 'application/json'})
    return res.end(serialize(data))
  }).catch((error) => {
    console.log(error)

    // Send empty JSON otherwise page load hangs indefinitely
    res.writeHead(200, {'Content-Type': 'application/json'})
    return res.end(serialize({}))
  })
})

Note that in our URL the id is prefixed with a colon, which informs Express that this portion of the URL is variable. This will be the record id, which is then passed to the getAirtablePost function via req.params.id. As with the other internal URL, the information is parsed and returned as JSON.
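The corresponding front-end page isn’t covered in detail here, but a minimal sketch of a hypothetical pages/post.js that consumes this route (reusing the Layout and Post components and the same basePath pattern as index.js) could look something like this:

import fetch from 'isomorphic-unfetch'

import Layout from '../components/Layout'
import Post from '../components/Post'

const PostPage = ({ post }) => (
  <Layout>
    {post && post.title
      ? <Post title={post.title} content={post.content} publish_date={post.publish_date} />
      : <p>Loading&hellip;</p>}
  </Layout>
)

PostPage.getInitialProps = async ({ query }) => {
  const basePath = (process.env.NODE_ENV === 'development') ? 'http://localhost:3000' : 'https://yourdomain.com'

  // `query.id` is supplied by the custom /post/:id/:slug route on the server
  const res = await fetch(`${basePath}/api/post/${query.id}`)
  const post = await res.json()

  return { post: post ? post.data : {} }
}

export default PostPage

Because no id prop is passed to <Post> here, the component renders the plain heading rather than the permalink, which is what the conditional in the Post component is for.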

8 Caching

We have a functioning blog but it’s likely we don’t want to hit Airtable for every pageview. A more ideal setup would be to store the results from Airtable in a cache with an expiration date. Then we can add an extra step to our getAirtablePosts and getAirtablePost functions to check for the data in the cache before calling on Airtable’s API.

With Next.js, the recommended way to do this is with a microservice, specifically something like micro-cacheable. That said, I couldn’t figure it out! Or rather — to save face — I was running out of the time that I had set aside to implement a blog on my site and so I went with Redis.

To my surprise, Redis was easy to implement in development and production (using Heroku).

First, install the Node.js Redis client:

npm install redis --save

Then what I did was create a new cache.js file in the top level of the project directory, set up Redis there, and then import it to server.js via a single cache method.

Here is the code for cache.js:

const redis = require('redis')
const client = redis.createClient()

// Log any errors
client.on('error', function(error) {
  console.log('Error:')
  console.log(error)
})

module.exports = client

As you can see, pretty simple. We import Redis and initialize a client. Then in server.js we’ll assign this module to the const cache. Near the top of server.js add this:

const cache = require('./cache')

Now we have access to caching with Redis. To store information with an expiration date we use setex like so:

cache.setex('cacheId', 30, JSON.stringify(data))

Where cacheId is a unique identifier for the cache, 30 is the number of seconds to keep the cache, and the last part is the actual data that is stored. I went ahead and used JSON.stringify() as part of this example because that is how we’ll stash our objects and arrays in the cache, as strings.

Retrieving data is just as easy:

cache.get('cacheId', function(error, data) {})

If the cache has expired then the return value will be empty and we can act accordingly.

To implement the cache, we wrap our getAirtablePosts and getAirtablePost functions in a cache check and add a command to store the new data when we do have to make an API call to Airtable.

So our new getAirtablePosts function looks like this:

/* Main Airtable Query */
const getAirtablePosts = (baseId) => {
  const base = new Airtable.base(baseId)

  return new Promise((resolve, reject) => {
    cache.get('airtablePosts', function(error, data) {
      if (error) throw error

      if (!!data) {
        // Stored value, grab from cache
        resolve(JSON.parse(data))
      }
      else {
        // No stored value, retrieve from Airtable
        const storeAirtablePosts = []

        // Query
        const apiQuery = {
          pageSize: 50,
          sort: [{field: 'Publish Date', direction: 'desc'}]
        }

        // Go get it!
        base('YOUR_TABLE_NAME').select(apiQuery).eachPage((records, fetchNextPage) => {
          // This function (`page`) will get called for each page of records.
          // The properties here would correspond to your records
          records.forEach(function(record) {
            const post = {
              title: record.get('Title'),
              content: record.get('Content'),
              publish_date: record.get('Publish Date'),
              slug: record.get('Slug'),
              id: record.id
            }

            storeAirtablePosts.push(post)
          })

          fetchNextPage()
        }, function done(error) {
          if (error) reject({ error })

          // Store results in Redis, expires in 30 sec
          cache.setex('airtablePosts', 30, JSON.stringify(storeAirtablePosts))

          // Finish
          resolve(storeAirtablePosts)
        })
      }
    })
  })
}

And our updated getAirtablePost like this:

/* Get Individual Airtable Record */
const getAirtablePost = (recordId, baseId) => {
  const base = new Airtable.base(baseId)
  const cacheRef = '_cachedAirtableBook_' + recordId

  return new Promise((resolve, reject) => {
    cache.get(cacheRef, function(error, data) {
      if (error) throw error

      if (!!data) {
        // Stored value, grab from cache
        resolve(JSON.parse(data))
      }
      else {
        base('YOUR_TABLE_NAME').find(recordId, function(err, record) {
          if (err) {
            console.error(err)
            reject({ err })
            // Bail out so we don't try to read from an undefined record
            return
          }

          const airtablePost = {
            title: record.get('Title'),
            content: record.get('Content'),
            publish_date: record.get('Publish Date')
          }

          // Store results in Redis, expires in 30 sec
          cache.setex(cacheRef, 30, JSON.stringify(airtablePost))

          resolve(airtablePost)
        })
      }
    })
  })
}

Now caching should be implemented; we don’t have to change anything on the front-end or in our internal URLs. They still parse the returned JSON whether it comes from the cache or from Airtable. You can test it by loading a page, changing the record in Airtable, and then making sure the front-end doesn’t update until 30 seconds have passed from the time of the initial page load.

Note: With Redis caching installed you will need to run redis-server alongside npm run dev in a separate Terminal window.
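In practice that just means two terminal sessions, something like:

# Terminal 1: start a local Redis server
redis-server

# Terminal 2: start the Next.js/Express app
npm run dev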

Production Caching

I run my site on Heroku. I used the Redis To Go add-on which was very easy to implement. Just install the add-on and then change your cache.js to this:

const redis = require('redis')

if (process.env.REDISTOGO_URL) {
  // Parse the connection details provided by the Redis To Go add-on
  const rtg = require('url').parse(process.env.REDISTOGO_URL)
  const client = redis.createClient(rtg.port, rtg.hostname)
  client.auth(rtg.auth.split(':')[1])

  module.exports = client
}
else {
  const client = redis.createClient()

  // Log any errors
  client.on('error', function(error) {
    console.log('Error:')
    console.log(error)
  })

  module.exports = client
}

And it should work. Installing the add-on creates the environment variables and everything else you need for it to work.

9 JSON Feed

You may remember that last year a new content syndication format called JSON Feed was announced. It’s like an RSS or Atom feed, but with JSON instead of XML!

We already have our internal URLs that retrieve and deliver data as JSON, so creating a JSON feed is pretty easy. We’ll create another URL — /feed/json — and just add the extra details in the returned JSON object to match the JSON Feed spec.

First we’ll need to import a different Markdown module for server-side rendering. Add this near the top of server.js:

// Markdown support for JSON feed
const MarkdownIt = require('markdown-it')
const md = new MarkdownIt()

And then here’s the code for the server.js route:

// JSON Feed
server.get('/feed/json', (req, res) => {
  Promise.resolve(getAirtablePosts('YOUR_BASE_ID')).then(data => {
    const jsonFeed = {
      "version": "https://jsonfeed.org/version/1",
      "home_page_url": "https://yourdomain.com/",
      "feed_url": "https://yourdomain.com/feed/json",
      "title": "YOUR SITE TITLE",
      "description": "YOUR SITE DESCRIPTION",
      "items": []
    }

    // Go through each item in returned array and add it to our JSON Feed object
    data.map((item) => {
      jsonFeed.items.push({
        "id": `https://yourdomain.com/post/${item.id}/${item.slug}`,
        "url": `https://yourdomain.com/post/${item.id}/${item.slug}`,
        "title": item.title,
        "content_html": !!item.content ? md.render(item.content) : '',
        "date_published": item.publish_date,
        "author": {
          "name": "YOUR NAME"
        }
      })
    })

    res.writeHead(200, {'Content-Type': 'application/json'})
    return res.end(JSON.stringify(jsonFeed, null, 2))
  }).catch((error) => {
    console.log(error)

    // Send empty JSON otherwise page load hangs indefinitely
    res.writeHead(200, {'Content-Type': 'application/json'})
    return res.end(serialize({}))
  })
})

The extra parameters passed to the JSON.stringify() call at the end tell the serializer to use new lines and indentation to format the data rather than returning it all as a single chunk of text. This is what gives us the nice, human-friendly output when you go directly to the feed URL. The Syntax section of Mozilla’s JSON documentation has the details.
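For example, with a small made-up object:

const feed = { title: 'My Feed', items: [{ id: 1 }] }

JSON.stringify(feed)
// '{"title":"My Feed","items":[{"id":1}]}'

JSON.stringify(feed, null, 2)
// '{
//   "title": "My Feed",
//   "items": [
//     {
//       "id": 1
//     }
//   ]
// }'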

Once your site is live (that is, not localhost) there is a JSON Feed Validator where you can test your new JSON Feed.

Full Code Samples

Again, I’ve put all of the code talked about here in a GitHub repository. I tried to keep it platform agnostic so there’s no Heroku-specific stuff, etc. I went ahead and put it in the most recent Next, Node, and Express install though. If you run into problems following this tutorial or have other questions you can file an issue there and I’ll do my best to address it.

Updates

11/30/18

The Node packages in Section 1 were updated to their latest versions as of November 30, 2018.

1/15/20

The Node packages in Section 1 were updated to their latest versions as of January 15, 2020. Also, the initial dotenv call in server.js was updated from require('dotenv').load() to require('dotenv').config() to fix an error. And as mentioned above, this tutorial has now been retired (see the note at the top).
