Functions are blocks of code that run on demand without the need to manage any infrastructure. Develop on your local machine, test your code from the command line (using doctl), then deploy to a production namespace or App Platform, no servers required.
To replicate the functionality of a full stack application, Jamstack websites rely on a combination of third-party APIs and serverless functions to handle CRUD operations.
In this part of the tutorial series, you:

- Configure the app's environment variables in a .env file.
- Create a packages directory and add two Node.js functions to it.
- Configure the project.yml deployment manifest.
- Deploy and test the functions using the doctl CLI.

To complete this part of the tutorial, you need:

- The serverless-jamstack repo forked into your GitHub account and then cloned to your local machine.
- doctl, the official DigitalOcean CLI.
- The npm package manager.

In the previous part of this tutorial, you copied the database's connection string. In this step, you use that connection string to configure the environment variables the app requires to access the MongoDB database.
Similar to most application structures, you can set up a .env file in the root directory of your project to contain your app's environment variables. When you deploy your functions, the DigitalOcean Functions service automatically copies the values in the .env file and makes them available to your functions in the cloud.
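For example, once the variable is declared in the project.yml manifest you create later in this tutorial, your function code can read it from process.env, as in this minimal sketch:

const uri = process.env.DATABASE_URL; // value copied from the .env file at deploy time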
To set up the environment variables, create a .env file in the root directory of the prerequisite repo using the nano text editor or your preferred text editor:
nano .env
Paste the following variable into the file, replacing the placeholder value with your database's connection string.
DATABASE_URL=your_databases_connection_string
The resulting file should look something like this:
DATABASE_URL=mongodb+srv://doadmin:<your-password>@serverless-jamstack-61a61ac5.mongo.ondigitalocean.com/admin?authSource=admin&replicaSet=do-coffee
Once you’ve defined the variable, save the file and close it.
Next, you need to add Node.js functions to the app that connect to the MongoDB database cluster and retrieve and post data to the database.
To add functions to your app, you need to create a special directory called packages in the app's root directory. The packages directory is where you add and organize your functions, and it requires a specific structure to ensure that your functions deploy correctly.
serverless-jamstack/
├── packages
│   └── cloud
│       ├── getCoffee
│       │   ├── index.js
│       │   ├── package-lock.json
│       │   └── package.json
│       │
│       └── postEmail
│           ├── index.js
│           ├── package-lock.json
│           └── package.json
├── project.yml
└── .env
Each child directory in the packages directory represents a single package, and each child directory of a package represents a single function. The packages directory can contain an arbitrary number of packages, and each package can contain an arbitrary number of functions.
In this tutorial, the app contains a package directory called cloud, which contains two functions: getCoffee and postEmail. Use the following set of mkdir commands to create the packages directory and the child directories required for the sample functions:
mkdir -p packages/cloud/getCoffee; mkdir packages/cloud/postEmail
The -p flag allows you to create the parent directories, packages and cloud, in tandem with the getCoffee and postEmail child directories.
After you’ve set up the package directory, navigate to the getCoffee directory to add your first function.
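For example, assuming you are running commands from the repo's root directory, change into the function's directory like this:

cd packages/cloud/getCoffee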
You can think of each function as its own self-contained Node.js project. This means that the package.json file your function relies on should be in the same directory as your function.
To initialize the getCoffee directory for a Node.js project, run:
npm init -y
This creates the package.json file that Node.js uses to track the project’s dependencies and attributes. The -y flag shortens the initialization process by skipping several user input prompts that are not required to complete this tutorial.
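The exact contents vary by npm version, but the generated package.json should look roughly like this, with the name field taken from the directory name:

{
  "name": "getCoffee",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}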
The functions in this tutorial require the mongodb module, MongoDB’s official Node.js client. This allows your functions to connect to the database and retrieve and post data.
Because functions can be deployed and tested from the cloud, you do not need to install the modules locally. Instead, you can use the --package-lock-only flag to update the package.json and package-lock.json files with the function’s required dependencies without installing them. To update the package.json file with this function’s dependencies, run:
npm install --package-lock-only mongodb
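Once the command finishes, the dependencies section of the package.json file lists the client. The version number shown below is only illustrative; yours reflects whatever release is current when you run the command:

"dependencies": {
  "mongodb": "^6.3.0"
}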
After updating the function’s dependencies, create a file called index.js in the getCoffee directory. This file contains your function’s code.
nano index.js
In the text editor, paste the following code into the index.js file:
// The function's dependencies.
const MongoClient = require('mongodb').MongoClient;

// Function starts here.
async function main() {
    // MongoDB client configuration.
    const uri = process.env['DATABASE_URL'];
    let client = new MongoClient(uri);

    // Instantiates a connection to the database and retrieves data from the `available-coffees` collection.
    try {
        await client.connect();
        const inventory = await client.db("do-coffee").collection("available-coffees").find().toArray();
        console.log(inventory);
        return {
            "body": inventory
        };
    } catch (e) {
        console.error(e);
        return {
            "body": { "error": "There was a problem retrieving data." },
            "statusCode": 400
        };
    } finally {
        await client.close();
    }
}

// IMPORTANT: Makes the function available as a module in the project.
// This is required for any functions that require external dependencies.
module.exports.main = main;
This function configures the MongoDB client with the database’s credentials, sends a request to the database, and then returns the response.
The function’s comments outline its functionality in detail, but there are two important pieces to note:
The last line (module.exports.main = main) exports the function as a Node.js module. If a function relies on any external dependencies, you must export the function as a module for it to run correctly on the DigitalOcean Functions service.
If your function returns data, the return value must be included in a response body like this:
return {
    "body": your_data
}
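The body field is all this tutorial's functions need, but as a sketch of what else the response object can carry, a web-accessible function can also return an explicit status code and headers alongside the body, similar to the error responses above:

return {
    "statusCode": 200,
    "headers": { "Content-Type": "application/json" },
    "body": { "coffees": [] }
};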
After pasting the code into the file, save the file and close it.
Now repeat the same steps for the packages/cloud/postEmail directory, using the same set of dependencies and the code shown below for its index.js file.
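As a sketch of the repeated steps, and assuming your shell is still in the getCoffee directory, the commands look like this:

cd ../postEmail
npm init -y
npm install --package-lock-only mongodb
nano index.js

In the text editor, paste the following code into the index.js file: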
// The function's dependencies.
const MongoClient = require('mongodb').MongoClient;

async function main(args) {
    // MongoDB client configuration.
    const uri = process.env['DATABASE_URL'];
    let client = new MongoClient(uri);

    // The email address passed to the function as a parameter.
    let newEmail = args.email;

    // Instantiates a connection to the database and inserts the email address into the `email-list` collection.
    try {
        await client.connect();
        await client.db("do-coffee").collection("email-list").insertOne({subscriber: newEmail});
        console.log(`added ${newEmail} to database.`);
        return { ok: true };
    } catch (e) {
        console.error(e);
        return {
            "body": { "error": "There was a problem adding the email address to the database." },
            "statusCode": 400
        };
    } finally {
        await client.close();
    }
}

// Makes the function available as a module in the project.
module.exports.main = main;
DigitalOcean Functions requires a YAML specification file named project.yml in the root folder of the app. The project.yml file is a manifest that lists each function in the app’s packages directory and makes the service aware of any environment variables.
Create a project.yml file in the root folder:
nano project.yml
Paste the following code into the project.yml file:
packages:
  - name: cloud
    actions:
      - name: getCoffee
        limits:
          timeout: 5000
          memory: 256
      - name: postEmail
        limits:
          timeout: 5000
          memory: 256
environment:
  DATABASE_URL: ${DATABASE_URL}
This project.yml file declares a package named cloud with two functions (actions) in it: getCoffee and postEmail. The environment stanza declares one environment variable in the global scope of the packages directory.
Once you’ve added the code to the file, save the file and close it.
Once you have added the functions to their respective directories, updated their package.json files with their dependencies, and configured the project.yml file, you can deploy the functions to DigitalOcean and test them from the command line using doctl.
To deploy the functions, start by connecting to the development namespace:
doctl serverless connect
The development namespace is where you can test functions in the cloud before deploying them to App Platform.
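If you want to confirm which namespace you are connected to before deploying, the doctl serverless status command reports the currently connected namespace:

doctl serverless status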
Once connected, deploy the functions by running the following command from the app’s root directory:
doctl serverless deploy .
A successful deploy returns output that looks like this:
Deployed functions ('doctl sbx fn get <funcName> --url' for URL):
- cloud/getCoffee
- cloud/postEmail
Finally, you can test a function by running the functions invoke command:
doctl serverless functions invoke cloud/getCoffee
The command returns the sample JSON data from the do-coffee database.
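You can also test the postEmail function from the command line. The --param (-p) flag passes a key-value pair to the function, which the function receives in its args object; the email address below is only a placeholder:

doctl serverless functions invoke cloud/postEmail -p email:sammy@example.com

A successful insert returns the { ok: true } response defined in the function's code.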
To retrieve the function’s URL, use the serverless functions get command with the --url flag:
doctl serverless functions get cloud/getCoffee --url
You can copy and paste the returned URL into your local browser to see how the function returns the JSON data, similar to how a traditional API endpoint would.
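You can also test the URL from the terminal by combining the two commands and passing the retrieved URL to curl:

curl "$(doctl serverless functions get cloud/getCoffee --url)"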
In this part of the tutorial series, you:

- Configured the app's environment variables in a .env file.
- Created a packages directory to house the functions.
- Configured the project.yml file.
- Deployed and tested the functions using doctl.

In the next part of this tutorial series, you use each function’s URL to connect the static HTML website to the functions, which allows the website to connect to the database.