Bridging Horizons: Restless-API with Express

Crafting a Weather Forecast App with the MERN Stack (MongoDB, Express, React, Node.js) ... Part 6

In the previous blog post, we successfully assembled the backend, allowing for the download, conversion, and storage of forecast data. Over the last three months, I dedicated time to refactoring a significant portion of the codebase, consequently enhancing the backend's functionality. Presently, it boasts support for four distinct forecast models and the capability to store wind forecasts for each time slot as a map.

To seamlessly integrate the backend with our feature-rich frontend, we need to establish an API. We'll accomplish this task using Express. I will guide you through setting up the project, installing crucial dependencies, and crafting a well-organized structure. From enhancing database models to creating responsive controllers and routes, we're laying the groundwork for a seamless integration.

Setting Up the Project Structure

Let's kick off the project by creating a dedicated folder and navigating into it. Subsequently, we'll initialize a new npm package to manage our dependencies.

mkdir windspotter-api
cd windspotter-api
npm init -y

This lays the foundation for our API project, providing us with a clean and organized structure to build upon.

Installing Essential Dependencies

Next, let's install the core dependencies required for our application. Express and Mongoose form the backbone of the API, while dotenv will help manage environment variables.

npm i express mongoose dotenv

Additionally, we'll install express-validator to validate the data received in request bodies and cors to allow communication between our backend and frontend, especially since they might be served from different origins.

npm i express-validator cors

Configuration Files

With our dependencies installed, it's time to set up the configuration files. Create a .gitignore file to ensure sensitive information like the MongoDB connection string and the node_modules folder are not exposed. Also, establish a .env file to store environment-specific variables. While I won't delve into the specifics of these files here, it's crucial to safeguard confidential information.

Feel free to use the following commands to create these files:

touch .gitignore
touch .env

Make sure to include the relevant entries in the .gitignore file and place your MongoDB connection string in the .env file. Keeping these files secure is paramount for maintaining the confidentiality of sensitive information.
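As a minimal example of what these two files can contain (the connection string and port below are placeholders, so replace them with your own values):

# .gitignore
node_modules
.env

# .env
MONGODB_URI=mongodb+srv://<user>:<password>@<cluster>/<database>
PORT=3000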

Organizing the Project Structure

Maintaining a clean and organized project structure is crucial for readability and scalability. Let's set up a simple and well-structured layout for our project:

mkdir src src/controllers src/models src/routes 
touch src/index.js
touch src/controllers/spotController.js
touch src/models/index.js src/models/spot.js
touch src/routes/index.js src/routes/spot.js

Here's a brief overview of the purpose of each directory and file:

  • src: This is the main source folder.

  • src/controllers: Contains controllers responsible for handling logic.

  • src/models: Houses database models.

  • src/routes: Manages the routes for our Express application.

  • src/index.js: Serves as an entry point for our application.

  • src/controllers/spotController.js: The controller file for managing spots.

  • src/models/index.js: An index file for managing and exporting all database models.

  • src/models/spot.js: The model file defining the structure of the Spot entity.

  • src/routes/index.js: An index file for managing and exporting all routes.

  • src/routes/spot.js: The route file defining endpoints related to spots.

This structure enhances code organization and simplifies import statements, promoting a cleaner and more maintainable codebase.

Enhancing Package.json

Let's make some final adjustments to your package.json file, including adding proper start scripts and incorporating nodemon as a development dependency:

npm i --save-dev nodemon
// ./package.json
...
"scripts": {
    "start": "node ./src/index.js",
    "devstart": "nodemon /src/index.js",
    "serverstart": "DEBUG=windspotter-api:* npm run devstart"
  },
...

Key changes include:

  • scripts section: Added a devstart script using nodemon for an improved development experience.

  • devDependencies section: Added nodemon as a development dependency.

Now, running npm run devstart will launch your application with nodemon, providing automatic restarts upon file changes during development.
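For reference, installing nodemon with --save-dev also adds an entry like the following to package.json (the exact version depends on when you install it):

// ./package.json
...
"devDependencies": {
    "nodemon": "^3.0.0"
  }
...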

Refining the Spot Model

To seamlessly integrate our database models into the API, let's begin by migrating the spot.js file from our backend app in the previous posts. If you're joining directly from Part 5, you might notice a subtle alteration in the model. As part of ongoing refinements to our backend app, I've made adjustments to the database model to better align with our objectives. Let's quickly delve into the new fields.

One notable addition is the windDirections array, designed to store a series of booleans. These booleans represent wind directions in a specific order. This enhancement serves a practical purpose: I intend to implement a feature indicating whether a forecast is suitable for activities such as windsurfing or kitesurfing at a given spot. Consequently, each spot needs a dedicated field that catalogs the wind directions conducive to surfing.

The introduction of the searchName field provides an efficient way to make spots searchable via the URL. Given that some spot names contain special characters or spaces, selecting a URL-friendly search name allows for smooth navigation, while the original display name remains unchanged.

Let's take a look at the revised spot.js model:

// models/spot.js
const mongoose = require('mongoose');
const { Schema } = mongoose;

const SpotSchema = new Schema({
  name: { type: String, required: true, maxLength: 100 },
  searchName: { type: String, required: true, maxLength: 100 },
  lat: { type: Number, required: true },
  lon: { type: Number, required: true },
  forecasts: [{ type: Schema.Types.ObjectId, ref: 'Forecast' }],
  windDirections: [{ type: Boolean }],
});

module.exports = mongoose.model('Spot', SpotSchema);

This model now captures the nuanced elements we need for a comprehensive and efficient representation of our spots in the API. Feel free to explore and adapt it to suit your specific project requirements.
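To make the two new fields a bit more tangible, here is a quick sketch of what creating a spot could look like. The number and order of the windDirections entries is just an example (eight compass sectors from N to NW); pick whatever convention your frontend expects. The snippet also assumes an open mongoose connection.

// sketch only: creating a spot with the new fields
const { Spot } = require('./models');

const exampleSpot = new Spot({
  name: 'Sankt Peter-Ording',
  searchName: 'sankt-peter-ording',
  lat: 54.32,
  lon: 8.6,
  // true = wind from this sector works for surfing at the spot,
  // assuming eight sectors in the order N, NE, E, SE, S, SW, W, NW
  windDirections: [false, false, false, false, true, true, true, true],
});

exampleSpot.save().then(() => console.log('example spot saved'));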

Unveiling the Forecast Model

Our journey into the intricacies of our models continues with the Forecast Model, a robust entity that boasts a myriad of additional fields to accommodate various prediction models and values. The time field, a Date Object, takes center stage as it represents the last update time of the forecast, providing crucial temporal context. As we delve deeper into this model, we'll explore how these fields harmonize with our diverse data requirements, setting the stage for building the controller in the upcoming sections.

// models/forecast.js
const mongoose = require('mongoose');
const { Schema } = mongoose;

const ForecastSchema = new Schema({
  forecastInfo: { type: Schema.Types.ObjectId, ref: 'ForecastInfo' },
  time: { type: Date, required: true },
  t_2m: { type: Object },
  v_10m: { type: Object },
  u_10m: { type: Object },
  vmax_10m: { type: Object },
  clct_mod: { type: Object },
  rain_gsp: { type: Object },
  mwd: { type: Object },
  swh: { type: Object },
  tm10: { type: Object },
  tmp: { type: Object },
  vgrd: { type: Object },
  ugrd: { type: Object },
  tcdc: { type: Object },
  pers: { type: Object },
  crain: { type: Object },
});

module.exports = mongoose.model('Forecast', ForecastSchema);

This comprehensive model is tailored to accommodate a wide range of data, ensuring flexibility and scalability as we progress through the development of our controller.
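The Object fields leave the exact shape of the data up to the backend; a shape consistent with the getLastForecastDay helper shown later in this post is to key each parameter by forecast time. Purely as an illustration (all timestamps and values here are made up), a stored forecast might look roughly like this:

// illustration only - timestamps and values are made up
{
  forecastInfo: ObjectId('...'),
  time: ISODate('2024-01-10T06:00:00Z'),
  t_2m: {
    '2024-01-10T07:00:00.000Z': 278.4,
    '2024-01-10T08:00:00.000Z': 278.9,
  },
  v_10m: {
    '2024-01-10T07:00:00.000Z': 4.2,
    '2024-01-10T08:00:00.000Z': 5.1,
  },
  // ...the remaining parameters follow the same pattern
}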

Enhancing the ForecastInfo Model

The ForecastInfo model, a cornerstone of our data architecture, remains largely consistent, with a few strategic additions. Two values from the grib header, nx and ny, are now stored to speed up frontend calculations. While they could be derived from the grid extent and resolution, storing them directly from the grib header keeps those calculations simple and cheap.

// models/forecastinfo.js
const mongoose = require('mongoose');
const { Schema } = mongoose;

const ForecastInfoSchema = new Schema({
  name: { type: String, required: true, maxLength: 100 },
  time: { type: Date, required: true },
  lo1: { type: Number, required: true },
  lo2: { type: Number, required: true },
  la1: { type: Number, required: true },
  la2: { type: Number, required: true },
  dy: { type: Number, required: true },
  dx: { type: Number, required: true },
  nx: { type: Number, required: true },
  ny: { type: Number, required: true },
});

module.exports = mongoose.model('ForecastInfo', ForecastInfoSchema);
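One way the frontend can put nx (together with the grid origin and resolution) to work is to translate a spot's coordinates into an index in the flattened forecast grid. This is just a sketch of that idea, not necessarily the exact calculation used later in the series:

// sketch: mapping a lat/lon onto the grid described by a ForecastInfo document
const getGridIndex = (forecastInfo, lat, lon) => {
  const x = Math.round((lon - forecastInfo.lo1) / forecastInfo.dx);
  const y = Math.round((lat - forecastInfo.la1) / forecastInfo.dy);
  return y * forecastInfo.nx + x; // row-major index into the flattened grid
};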

Structuring Models in the Index File

Let's explore how we organize and structure our models in the index file for optimal integration into our backend system. This section provides insights into the purpose of each model, acknowledging the exclusion of the MapForecast Model for the time being. By adopting this modular approach, we ensure a clean and efficient workflow as we progress through the development of our application.

// models/index.js
const Spot = require('./spot');
const Forecast = require('./forecast');
const ForecastInfo = require('./forecastinfo');
// const MapForecast = require('./mapforecast');

module.exports = {
  Spot,
  Forecast,
  ForecastInfo,
  // MapForecast,
};

This concise index file serves as a central hub for seamlessly importing and exporting our models, setting the stage for a well-organized and scalable backend architecture. As we navigate through the subsequent sections, the focus remains on the core models essential for the functionality of our application.

Crafting Controllers for Responsive Server Interaction

As we delve into the Controllers, we encounter the pivotal logic that shapes our server's responses to incoming requests. This section begins by importing our validator and models, followed by the introduction of essential helper functions designed to enhance code efficiency.

// controllers/spotController.js
const { body, validationResult } = require('express-validator');
const { Spot, Forecast, ForecastInfo } = require('../models');

// Helper function to send error response
const sendError = (res, message) => res.status(400).json({ message });

const nameValidator = body('name')
  .trim()
  .isLength({ min: 3 })
  .withMessage('Spot name must be at least 3 chars long')
  .isLength({ max: 50 })
  .withMessage("Spot name can't be longer than 50 chars")
  .escape();

const latValidator = body('lat')
  .trim()
  .isNumeric()
  .withMessage('Latitude must be a number')
  .escape();

const lonValidator = body('lon')
  .trim()
  .isNumeric()
  .withMessage('Longitude must be a number')
  .escape();

const getLastForecastDay = (forecast) => {
  // get the last day of the forecast and return it
  // have to convert the date string to a date object while sorting
  const lastDay = Object.keys(forecast)
    .sort((a, b) => new Date(a) - new Date(b))
    .pop();

  return lastDay;
};
...

This meticulous setup not only streamlines our error-handling mechanisms but also ensures that incoming data is thoroughly validated. The nameValidator, latValidator, and lonValidator stand as gatekeepers, enforcing specific criteria for spot names, latitudes, and longitudes, respectively.

As we proceed through the controllers, these validators will play a pivotal role in maintaining data integrity and enhancing the security of our server. The getLastForecastDay function, with its sorting mechanism, sets the stage for efficiently retrieving the latest forecast day—an indispensable asset for our application.
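To see what getLastForecastDay does in isolation, here is a tiny example with made-up keys:

// made-up keys: the most recent one wins
const exampleDays = {
  '2024-01-09T23:00:00.000Z': 3.7,
  '2024-01-10T07:00:00.000Z': 4.2,
  '2024-01-10T08:00:00.000Z': 5.1,
};

console.log(getLastForecastDay(exampleDays));
// -> '2024-01-10T08:00:00.000Z'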

In the subsequent sections, we'll delve into the controllers' intricacies, where these validators and helper functions come to life, shaping the dynamic and responsive behavior of our server.

Listing Spots: Retrieving All Spots

The spotListGet function is dedicated to fetching and returning a list of all spots. Using the Spot.find() method, it retrieves every spot from the MongoDB database and, upon success, sends a response with a status code of 200 (OK) and the spots array. If the query fails, the sendError helper function is invoked, providing an appropriate error message. Note that an empty collection is not an error: Spot.find() simply returns an empty array.

// controllers/spotController.js
...
exports.spotListGet = async (req, res) => {
  try {
    const spots = await Spot.find();
    res.status(200).json({ spots });
  } catch {
    sendError(res, 'failed to find any spots');
  }
};
...

Creating a New Spot: Handling POST Requests

The createSpotPost function manages POST requests for creating new spots. The validator middleware for name, latitude, and longitude runs first, and validationResult is checked before anything is written to the database, so invalid input is rejected with a 400 response. If the creation and saving process succeeds, a response with a status code of 201 (Created) and the new spot object is sent. If saving fails, the sendError helper function handles the response.

// controllers/spotController.js
...
exports.createSpotPost = [
  nameValidator,
  latValidator,
  lonValidator,
  async (req, res) => {
    // reject the request early if any of the validators above reported an error
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return sendError(res, errors.array()[0].msg);
    }
    try {
      const spot = new Spot(req.body);
      await spot.save();
      res.status(201).json({ spot });
    } catch {
      sendError(res, 'failed to create new Spot');
    }
  },
];

...
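Once the routes and server from the sections below are wired up, you can try this handler with a request along these lines (the port comes from your .env, and the payload is just an example):

curl -X POST http://localhost:3000/spot/new \
  -H "Content-Type: application/json" \
  -d '{"name": "Sankt Peter-Ording", "searchName": "sankt-peter-ording", "lat": 54.32, "lon": 8.6, "windDirections": [false, false, false, false, true, true, true, true]}'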

Updating a Spot: Handling PUT Requests

The spotPut function deals with PUT requests, aiming to update spot data. The same validation and early-exit steps are taken at the beginning. Once the spot to be updated is found, the data is modified, the spot is saved, and a response with a status code of 200 (OK) and the updated spot object is sent. If the spot is not found or an error occurs, an appropriate error message is sent.

// controllers/spotController.js
...
exports.spotPut = [
  nameValidator,
  latValidator,
  lonValidator,
  async (req, res) => {
    // same early exit as in createSpotPost when validation fails
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return sendError(res, errors.array()[0].msg);
    }
    try {
      const spot = await Spot.findById(req.params.id);
      if (spot) {
        Object.assign(spot, req.body);
        await spot.save();
        res.status(200).json({ spot });
      } else {
        sendError(res, 'Spot not found');
      }
    } catch {
      sendError(res, 'failed to change spot');
    }
  },
];
...

These controllers form the backbone of spot operations, ensuring effective interaction with the MongoDB database and providing clear, informative responses to incoming requests. As you explore the full file appended at the end of this post, you'll notice similar logic for additional operations, with minor variations in data conversion after retrieving objects from the database.

Feel free to delve deeper into the code for a comprehensive understanding of how these controllers collectively contribute to the functionality of our application.
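The handlers not shown above (spotGet, spotByNameGet, spotDelete, and the forecast getters) follow the same pattern as the ones we just covered. As a rough sketch of the idea, and not necessarily identical to the version in the repository, a lookup by searchName could look like this:

// controllers/spotController.js (sketch of a lookup by searchName)
exports.spotByNameGet = async (req, res) => {
  try {
    const spot = await Spot.findOne({ searchName: req.params.name });
    if (spot) {
      res.status(200).json({ spot });
    } else {
      sendError(res, 'Spot not found');
    }
  } catch {
    sendError(res, 'failed to find spot');
  }
};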

Setting Up Routes for Seamless Interaction

In the Spot routes (routes/spot.js), we establish a connection between specific URL paths and the corresponding controller functions. This involves importing the spotController, creating a router object, and mapping URLs to controller operations. Note the use of placeholders like :id and :name in the URLs, which Express exposes to the controllers as route parameters (req.params). Since Express matches routes in registration order, the static paths (/list, /new) and the /name/:name routes come before the generic /:id route, so they aren't swallowed by the :id parameter. Once all routes are set up, the router is exported.

// routes/spot.js
const { Router } = require('express');
const spotController = require('../controllers/spotController');

const router = Router();

router.get('/list', spotController.spotListGet);
router.post('/new', spotController.createSpotPost);
router.get('/name/:name', spotController.spotByNameGet);
router.get('/:id', spotController.spotGet);
router.delete('/:id', spotController.spotDelete);
router.put('/:id', spotController.spotPut);
router.get('/:id/forecast', spotController.spotForecastGet);
router.get('/name/:name/forecast', spotController.spotForecastByNameGet);

module.exports = router;

Index Routes: Assembling Route Modules

The routes/index.js file orchestrates the overall routing structure. Here, the spot route is included, and the map route, though commented out for now, can be uncommented as needed.

// routes/index.js
const spot = require('./spot');
// const map = require('./map');

module.exports = {
  spot,
  // map,
};

Main Index File: Bringing It All Together

The main index.js file serves as the nexus where modules, routes, and server configurations converge. After setting up the MongoDB connection and creating an Express app and HTTP server, routes are imported and configured using .use(). In this example, the spot route is specified.

// src/index.js
const cors = require('cors');
const mongoose = require('mongoose');
const express = require('express');
const http = require('http');
require('dotenv/config');

const mongoDB = process.env.MONGODB_URI;
mongoose.connect(mongoDB, { useUnifiedTopology: true, useNewUrlParser: true });
const db = mongoose.connection;
db.on('error', console.error.bind(console, 'mongo connection error'));

const app = express();
const httpServer = http.createServer(app);

const routes = require('./routes');

app.use(cors());
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

app.use('/spot', routes.spot);
// app.use('/map', routes.map);

httpServer.listen(process.env.PORT, () => {
  return console.log(`api listening on port ${process.env.PORT}!`);
});

This comprehensive structure forms the backbone of your server application, seamlessly connecting routes with controllers and setting the stage for smooth interactions between the client and MongoDB database. As you expand your application, this modular approach allows for easy integration of additional routes and functionalities.
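With everything in place, a quick smoke test could look like this (the port is whatever you set in your .env):

npm run devstart
# in a second terminal:
curl http://localhost:3000/spot/list
# -> {"spots":[]} until you create your first spot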

Conclusion: Forging a Robust Backend Foundation

In this backend development journey for our windspotter application, we've laid a robust foundation, from project setup, dependency installation, and configuration files to a clean, well-organized structure.

Our choice of dependencies, like Express and Mongoose, emphasizes efficiency and security. The well-organized project structure ensures clarity and scalability, evident in our spot and forecast models with unique attributes.

Controllers handle requests dynamically, offering reliability in listing spots, creating new ones, and updating entries. Routes act as vital connectors, facilitating seamless communication between URLs and controllers.

The main index file brings it all together—establishing MongoDB connections, configuring routes, and setting up the server. This modular approach positions our backend for future enhancements.

Thank you for reading, and as always, your comments and feedback are greatly appreciated. See you soon.

-Tobias

CODE: github.com/Stonehagen/windspotter-api
