Build and sell your own API (2024)

In this guide, you will learn how to build a Climate Change API, publish it on RapidAPI, and monetize it.

The guide is tailored for beginners who want to make money with their coding skills. We will build a climate change news API using the Express, Axios, and Cheerio packages. I will also show you how to list your API on the largest API hub, RapidAPI, and monetize it.

Prerequisites

For this guide, only a basic understanding of JavaScript is required. Even if you don't have a strong command of JavaScript, follow along anyway, because there won't be a lot of code involved. Before we start, ensure you have Node.js installed on your machine.
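You can verify the installation by checking the versions from your terminal:

```bash
node -v
npm -v
```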

What is an API?

API stands for Application Programming Interface. APIs allow technologies to talk to each other and are essential to many services we rely on. They fetch and shape information and pass it from one system to another.

As a developer, you could use TikTok's APIs to embed a live TikTok feed on your website, or use an API in a two-way stream to get, post, or delete data from a movie database. To summarize, APIs are everywhere today.
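For example, fetching data from an API takes just a few lines with Axios. A minimal sketch (the endpoint here is hypothetical, purely for illustration):

```js
const axios = require('axios');

// Hypothetical public endpoint, used only to illustrate an API call
axios
  .get('https://api.example.com/movies/42')
  .then(response => console.log(response.data))
  .catch(err => console.log(err));
```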

RapidAPI Hub

After building our API, we will list it on the RapidAPI Hub. As a developer, when you launch your API on the RapidAPI platform, you can sell access to it to anyone who wants to use what you have made. This access comes in different plans you can define, letting you control how you monetize what you have built.

As it is the largest API hub out there right now, the traffic will be in our favor, meaning you could take your API idea from a simple source of passive income to a full-blown startup, depending on how much time you want to dedicate to it.

What are we waiting for? Let's do it.

Step 1: Sign up on RapidAPI

Since we will be using the RapidAPI platform, let us go ahead and sign up. As I want to create my own API, I will click on "My APIs" and leave it at that for now.

Step 2: Decide the topic for our API

For this guide, I have decided to build an API that serves climate change news from various publications worldwide. If you want to go with another topic, like crypto, that is entirely up to you.

Step 3: Create a new project on your code editor

I'm going to start off by creating a blank project using WebStorm. Feel free to use whatever code editor you wish and create an empty directory, so we can start entirely from scratch.

Step 4: Create the package.json file

As a general rule, any project that uses Node.js will need a package.json file. To create the file, run the command npm init in your terminal, making sure you are in the directory we just created.

The command will generate the file for us, prompting us to answer a few questions like the name, version, and description of our package. It will also ask us to choose an entry point, which we will keep as index.js, and we can leave the rest of the questions blank.

As you can see, a package.json file has been created in the climate-change-api directory with all the keys we were asked to fill out. This file will also list all our installed packages.
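At this point the file will look roughly like this (your name and description values may differ):

```json
{
  "name": "climate-change-api",
  "version": "1.0.0",
  "description": "An API that serves climate change news",
  "main": "index.js"
}
```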

Step 5: Install the packages

We will install the Cheerio, Express, and Axios packages for this guide. Run the following commands to install them:

```bash
npm i cheerio
npm i express
npm i axios
```

Step 6: Create the index.js file

Within our Climate Change API directory, we will create a new JavaScript file and call it index.js. This will be our server, and we will write our code in it.

Step 7: Define the PORT

We will start coding by defining the port our server will listen on. I am going to set the PORT to 8000, but this can be whatever you wish.
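In index.js, that is a single line; we will reference this constant when we start the server:

```js
const PORT = 8000;
```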

Step 8: Initiate our packages

To use our packages in the backend, we will require each of the three packages at the top of index.js using the standard syntax.

```js
const express = require('express');
const axios = require('axios');
const cheerio = require('cheerio');
```

To initiate Express, we will call the express function and save the result in an app variable. Then we will make this app listen on our port to run the server.

```js
const app = express();

app.listen(PORT, () => console.log(`server running on PORT ${PORT}`));
```

I'll also add a start command to the scripts section of the package.json file and use nodemon to listen for any changes made to index.js.
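A minimal sketch of that scripts entry, assuming nodemon has been installed (npm i nodemon):

```json
{
  "scripts": {
    "start": "nodemon index.js"
  }
}
```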

To run our backend, I'll type the command npm run start in the terminal. Next, we will make Express (i.e., app) listen on the homepage path and create a response.

```js
app.get('/', (req, res) => {
  res.json('Welcome to my Climate Change News API');
});
```

Here you can see it is working perfectly.
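You can also verify it from another terminal while the server is running on port 8000:

```bash
curl http://localhost:8000/
# "Welcome to my Climate Change News API"
```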

Step 9: Getting a news source

Now let us start scraping data from a news source, say, The Guardian. Axios and Cheerio will be used for this purpose. First, I'll use Axios to grab all the HTML from the Guardian webpage. Then, using Cheerio, we will look for elements: specifically, all the <a> tags whose text contains the word "climate". We will grab the text of each of those tags and save it as the title, and grab the href of each tag and save it as the url.

```js
app.get('/news', (req, res) => {
  const articles = [];

  axios
    .get('https://www.theguardian.com/environment/climate-crisis')
    .then(response => {
      const html = response.data;
      const $ = cheerio.load(html);

      // Find every <a> tag whose text contains the word "climate"
      $('a:contains("climate")', html).each(function () {
        const title = $(this).text();
        const url = $(this).attr('href');
        articles.push({
          title,
          url,
        });
      });

      res.json(articles);
    })
    .catch(err => console.log(err));
});
```

Great! We have scraped the Guardian webpage. You can install a JSON Viewer extension in your browser to make the output readable; with the extension, it looks great.

So every time we found an <a> tag containing the word "climate", we created a title and url from its text and href, giving us an array full of titles and URLs from the Guardian. How cool is that!
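The response will look something like this (the titles and URLs here are illustrative, not actual scraped data):

```json
[
  {
    "title": "Climate crisis: latest developments",
    "url": "https://www.theguardian.com/environment/..."
  },
  {
    "title": "What the new climate report means",
    "url": "https://www.theguardian.com/environment/..."
  }
]
```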

Step 10: Getting multiple news sources

Now that we have successfully scraped one website, let's scrape climate news from multiple sources to make our API more useful. I'll start by creating an array of the newspapers I want to scrape. I will keep three, but you can add as many as you want.

```js
const newspapers = [
  {
    name: 'cityam',
    address: 'https://www.cityam.com/london-must-become-a-world-leader-on-climate-change-action/',
  },
  {
    name: 'thetimes',
    address: 'https://www.thetimes.co.uk/environment/climate-change',
  },
  {
    name: 'guardian',
    address: 'https://www.theguardian.com/environment/climate-crisis',
  },
];
```

Then I'll basically run the same code in a loop over all those newspapers, as follows:

```js
const articles = [];

// Fetch each newspaper once when the server starts and collect matching links
newspapers.forEach(newspaper => {
  axios.get(newspaper.address).then(response => {
    const html = response.data;
    const $ = cheerio.load(html);

    $('a:contains("climate")', html).each(function () {
      const title = $(this).text();
      const url = $(this).attr('href');
      articles.push({ title, url, source: newspaper.name });
    });
  });
});

app.get('/', (req, res) => {
  res.json('Welcome to my Climate Change News API');
});

app.get('/news', (req, res) => {
  res.json(articles);
});
```

Perfect! You can see we are now getting data from all three sources. Note that because the scraping runs when the server starts, the /news endpoint simply returns whatever has been collected so far. However, some URLs still need to be completed: the Telegraph, which I'll swap into our list next, uses relative links that lack a base, as seen in its page's source code.

This issue can be solved simply by adding a base property to each newspaper and prepending it to every URL we scrape from that source.

```js
const newspapers = [
  {
    name: 'thetimes',
    address: 'https://www.thetimes.co.uk/environment/climate-change',
    base: '',
  },
  {
    name: 'guardian',
    address: 'https://www.theguardian.com/environment/climate-crisis',
    base: '',
  },
  {
    name: 'telegraph',
    address: 'https://www.telegraph.co.uk/climate-change',
    base: 'https://www.telegraph.co.uk',
  },
];

const articles = [];

newspapers.forEach(newspaper => {
  axios.get(newspaper.address).then(response => {
    const html = response.data;
    const $ = cheerio.load(html);

    $('a:contains("climate")', html).each(function () {
      const title = $(this).text();
      const url = $(this).attr('href');
      articles.push({
        title,
        url: newspaper.base + url,
        source: newspaper.name,
      });
    });
  });
});
```

Step 11: Getting an individual news source by parameter

Now I want my API to be able to serve data from individual news sources separately. This can be done by adding a new app.get route with a route parameter. Here is the complete index.js:

```js
const PORT = 8000;
const express = require('express');
const axios = require('axios');
const cheerio = require('cheerio');

const app = express();

const newspapers = [
  {
    name: 'thetimes',
    address: 'https://www.thetimes.co.uk/environment/climate-change',
    base: '',
  },
  {
    name: 'guardian',
    address: 'https://www.theguardian.com/environment/climate-crisis',
    base: '',
  },
  {
    name: 'telegraph',
    address: 'https://www.telegraph.co.uk/climate-change',
    base: 'https://www.telegraph.co.uk',
  },
];

const articles = [];

// Scrape every newspaper once at startup and collect the results
newspapers.forEach(newspaper => {
  axios.get(newspaper.address).then(response => {
    const html = response.data;
    const $ = cheerio.load(html);

    $('a:contains("climate")', html).each(function () {
      const title = $(this).text();
      const url = $(this).attr('href');
      articles.push({
        title,
        url: newspaper.base + url,
        source: newspaper.name,
      });
    });
  });
});

app.get('/', (req, res) => {
  res.json('Welcome to my Climate Change News API');
});

app.get('/news', (req, res) => {
  res.json(articles);
});

// Serve articles from a single source, e.g. /news/guardian
app.get('/news/:newspaperId', (req, res) => {
  const newspaperId = req.params.newspaperId;
  const newspaper = newspapers.find(n => n.name === newspaperId);

  // Guard against IDs that are not in our list
  if (!newspaper) {
    return res.status(404).json('Newspaper not found');
  }

  axios
    .get(newspaper.address)
    .then(response => {
      const html = response.data;
      const $ = cheerio.load(html);
      const specificArticles = [];

      $('a:contains("climate")', html).each(function () {
        const title = $(this).text();
        const url = $(this).attr('href');
        specificArticles.push({
          title,
          url: newspaper.base + url,
          source: newspaperId,
        });
      });

      res.json(specificArticles);
    })
    .catch(err => console.log(err));
});

app.listen(PORT, () => console.log(`server running on PORT ${PORT}`));
```

Step 12: Deploy the API to Heroku

To prepare our API for deployment to Heroku, we will need nodemon installed (npm i nodemon). We will also have to change the port setup so the app uses the port Heroku assigns through the PORT environment variable when deployed, and falls back to our local port when running locally.

```js
const PORT = process.env.PORT || 8000;
```

After that, we will sign up on Heroku and create a new app.

Once you create it, you will see instructions on the Heroku page. Follow those instructions and run the listed commands one after another.
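The exact commands are shown on your app's Heroku page, but they typically look something like this (replace climate-change-api with your own app name):

```bash
heroku login
git init
heroku git:remote -a climate-change-api
git add .
git commit -m "Deploy Climate Change API"
git push heroku main
```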

Once done, your app will be deployed on Heroku. Copy the URL of your Heroku deployment.

Step 13: Create API on RapidAPI

We'll go back to our RapidAPI dashboard, give our API a name and brief description, and add the base URL of the Heroku deployment we just created.

Next, I will add a couple of endpoints by clicking on "REST Endpoint".

After that, I'll choose the pricing for my API. There are multiple plans to choose from, ranging from Basic to Mega.

Lastly, we'll make our API public, and that's it! Our Climate Change API is live on RapidAPI.
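Consumers can then call the API through RapidAPI with their own key. A minimal sketch with Axios, assuming a host of climate-change-api.p.rapidapi.com (your actual host and key are shown on your API's RapidAPI page):

```js
const axios = require('axios');

axios
  .get('https://climate-change-api.p.rapidapi.com/news', {
    headers: {
      // Both values come from your API's page on RapidAPI
      'X-RapidAPI-Key': 'YOUR_RAPIDAPI_KEY',
      'X-RapidAPI-Host': 'climate-change-api.p.rapidapi.com',
    },
  })
  .then(response => console.log(response.data))
  .catch(err => console.log(err));
```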

Wrap Up

By following the steps in this guide, you can build your own APIs and make money by selling access to them on the RapidAPI platform. Happy building!

