Clouds to Code: The Tale of .Grib File Automation
Crafting a Weather Forecast App with the MERN Stack (MongoDB, Express, React, Node.js) ... Part 5
Automating the Process
In the previous blog post, we converted .grib files using grib2json. Using bilinear interpolation, we enhanced accuracy and then stored the data in MongoDB.
In this part, we'll combine everything, from downloading the files to storing the forecast values, into a unified updateDatabase() function. We'll also use the npm package cron to schedule updateDatabase() automatically, eliminating manual invocations.
Cron for the Win
Let's begin by installing the npm package cron. A brief introduction for those unfamiliar with cron jobs and similar scheduling utilities: ever wondered how to instruct a server or computer to execute a specific command or program at a designated time? Cron is the solution.
There are two routes to choose from:
1. Use cron on your server to execute your Node.js program on a set schedule.
2. Use the npm package cron within your Node.js program to run your chosen function on a timetable.
We'll opt for the second route. To install the package, run the following in the root directory of the project:
npm install cron
Getting Chummy with Cron
Initiating cron is straightforward. We simply extract CronJob from cron and set up a new CronJob. Create a constant named job and assign a new CronJob() to it. The constructor accepts several arguments:
Cron pattern: the pattern '*/30 * * * *' triggers our function at minute 0 and 30 of every hour, i.e. every 30 minutes. Refer to crontab.guru to customize your pattern.
Target function: the updateDatabase() function, which will be discussed shortly.
Post-job action: we'll set this to null for now.
Auto-start: if true, the job begins upon initialization. Otherwise, manually start the job using job.start().
Timezone: for our current setup running every 30 minutes, this is optional. However, you might retain it for future flexibility.
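To see exactly which minutes a pattern like '*/30 * * * *' fires on, here is a small sketch. The expandMinutes helper is hypothetical (it is not part of the cron package); it only expands the minute field over the range 0–59:

```javascript
// Hypothetical helper (not part of the cron package): expand a cron
// minute field such as '*/30' over 0-59 to see when the job fires.
const expandMinutes = (field) => {
  const minutes = [...Array(60).keys()]; // 0, 1, ..., 59
  if (field === '*') return minutes;
  if (field.startsWith('*/')) {
    const step = Number(field.slice(2));
    return minutes.filter((m) => m % step === 0);
  }
  return [Number(field)];
};

console.log(expandMinutes('*/30')); // [ 0, 30 ] — twice per hour
```

So '*/30 * * * *' means "at the top and bottom of every hour", not "every 30 minutes from whenever the job started".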
// src/index.js
const { CronJob } = require('cron');
...
const job = new CronJob(
  '*/30 * * * *',
  () => updateDatabase('grib-d2'), // pass a function; don't invoke it here
  null,
  true,
  'Europe/Berlin',
);
Let's Jazz Up Our updateDatabase()!
Now that our CronJob is established, we'll flesh out the updateDatabase() function. Start by creating a folder and a corresponding index.js file.
mkdir src/updateDatabase
touch src/updateDatabase/index.js
Stepping Through the Update
The updateDatabase() function accepts a forecastName parameter. It begins by purging outdated forecast files that might be lingering in the folder. Next, it fetches the forecastInfo from the database by looking up the forecastName. The subsequent step is the download, passing the latest forecast timestamp, or undefined if no forecastInfo exists yet. If the download fails, the function aborts.
Once the download completes, the function enumerates the acquired files, sorts them by forecast value, and converts them in one batch. As a finishing touch, it deletes all files, concluding the procedure.
// ./src/updateDatabase/index.js
const fs = require('fs');
const { dataValues } = require('../config');
const { downloadFiles } = require('../ftp');
const { convertGrib } = require('../convert_grib');
const { ForecastInfo } = require('../models');
...
const updateDatabase = async (forecastName) => {
  console.log('delete old files');
  await deleteFiles(getFiles('./grib_data'));
  console.log('deleted old files');
  const forecastInfo = await ForecastInfo.findOne({ name: forecastName });
  console.log('get files');
  const newForecastTime = await downloadFiles(
    forecastInfo ? forecastInfo.time : undefined,
  );
  if (!newForecastTime) {
    return false;
  }
  console.log('download complete');
  console.log('update Database');
  const files = getFiles('./grib_data');
  const sortedFiles = dataValues.map((value) => sortFiles(files, value));
  await convertAllGrib(sortedFiles);
  console.log('updated Database');
  console.log('delete files');
  await deleteFiles(getFiles('./grib_data'));
  console.log('deleted files');
  return true;
};
...
Nuts and Bolts: Understanding Helper Functions
You might have observed a suite of helper functions designed for code organization. Let's dissect them.
Fetching files: the getFiles function simply retrieves files using fs.readdirSync(), filtering out hidden system files.
Sorting: the sortFiles function filters a list of files based on a value embedded in the filename.
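As a quick sketch of how this filtering works, here is the helper run against made-up filenames that follow the `_date_run_step_value.grib` shape the regex expects (the names themselves are hypothetical):

```javascript
// Sketch of the sortFiles helper with hypothetical filenames; the regex
// captures the letters between the last underscore and '.grib'.
const sortFiles = (files, value) => {
  const regex = /(?<=_[0-9]+_[0-9]+_[0-9]+_)[A-Za-z]+(?=\.grib)/;
  return files.filter((file) => {
    const match = file.match(regex);
    return match && match[0] === value;
  });
};

const files = [
  'forecast_20230101_00_000_t.grib',
  'forecast_20230101_00_000_u.grib',
  'forecast_20230101_00_001_t.grib',
];
console.log(sortFiles(files, 't'));
// [ 'forecast_20230101_00_000_t.grib', 'forecast_20230101_00_001_t.grib' ]
```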
File deletion: once all operations are completed, we prepare for a fresh cycle by deleting all files using fs.unlink(). We harness Promise.all so the deletions run concurrently.
Batch conversion: we map each convertGrib call into an array of promises and pass it to Promise.all, so the conversions run concurrently. This is faster than processing each conversion serially.
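A minimal sketch of that batch pattern, with a simulated converter standing in for convertGrib: Promise.all starts every task at once and resolves with the results in input order, so the total wait is roughly the slowest task rather than the sum of all of them.

```javascript
// fakeConvert stands in for convertGrib: each "conversion" is an async
// task that resolves after a delay. Promise.all runs them concurrently
// and preserves input order in the results.
const fakeConvert = (name, ms) =>
  new Promise((resolve) => setTimeout(() => resolve(`${name} converted`), ms));

const convertAll = (jobs) =>
  Promise.all(jobs.map(({ name, ms }) => fakeConvert(name, ms)));

convertAll([
  { name: 'u-wind', ms: 30 },
  { name: 'v-wind', ms: 10 },
]).then((results) => console.log(results));
// [ 'u-wind converted', 'v-wind converted' ] — total wait ≈ 30 ms, not 40
```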
Conclusion
In the last five parts, we delved deeply into automating the handling of .grib files. By implementing a comprehensive updateDatabase() function, we have streamlined the journey from downloading forecast files to storing their values. With the integration of the cron npm package, we have added an essential layer of automation, ensuring our function runs at set intervals without manual intervention.
Our modular approach, reflected in our use of helper functions, not only maintains code readability but also ensures that each aspect of our application — from file retrieval and sorting to deletion and batch conversion — is optimized for performance.
From here, you can build any stack on top of this MongoDB foundation. In the next episode, we will begin by building a compact Express server for our forecast app.
Your comments and feedback are greatly appreciated. Stay with me as we continue this journey to create a weather forecasting app based on .grib files.
-Tobias
CODE:
// ./src/updateDatabase/index.js
const fs = require('fs');
const { dataValues } = require('../config');
const { downloadFiles } = require('../ftp');
const { convertGrib } = require('../convert_grib');
const { ForecastInfo } = require('../models');

const getFiles = (filePath) => {
  const files = fs.readdirSync(filePath);
  // remove hidden files from fileList
  return files.filter((file) => !file.startsWith('.'));
};

const sortFiles = (files, value) => {
  // match the value part of names like <model>_<date>_<run>_<step>_<value>.grib
  const regex = /(?<=_[0-9]+_[0-9]+_[0-9]+_)[A-Za-z]+(?=\.grib)/;
  return files.filter((file) => {
    const match = file.match(regex);
    return match && match[0] === value;
  });
};

const deleteFiles = async (files) => {
  if (!files) {
    return;
  }
  const unlinkPromises = files.map((file) =>
    fs.promises.unlink(`./grib_data/${file}`),
  );
  await Promise.all(unlinkPromises);
};

const convertAllGrib = async (filesList) => {
  const convertPromises = filesList.map((files) =>
    convertGrib(files, './grib_data'),
  );
  await Promise.all(convertPromises);
};
const updateDatabase = async (forecastName) => {
  console.log('delete old files');
  await deleteFiles(getFiles('./grib_data'));
  console.log('deleted old files');
  const forecastInfo = await ForecastInfo.findOne({ name: forecastName });
  console.log('get files');
  const newForecastTime = await downloadFiles(
    forecastInfo ? forecastInfo.time : undefined,
  );
  if (!newForecastTime) {
    return false;
  }
  console.log('download complete');
  console.log('update Database');
  const files = getFiles('./grib_data');
  const sortedFiles = dataValues.map((value) => sortFiles(files, value));
  await convertAllGrib(sortedFiles);
  console.log('updated Database');
  console.log('delete files');
  await deleteFiles(getFiles('./grib_data'));
  console.log('deleted files');
  return true;
};

module.exports = {
  updateDatabase,
};
// src/index.js
const mongoose = require('mongoose');
const { CronJob } = require('cron');
require('dotenv/config');
const { updateDatabase } = require('./updateDatabase');

mongoose.connect(process.env.MONGODB_URI, {
  useUnifiedTopology: true,
  useNewUrlParser: true,
});
const db = mongoose.connection;
db.on('error', console.error.bind(console, 'mongo connection error'));

const job = new CronJob(
  '*/30 * * * *',
  () => updateDatabase('grib-d2'), // pass a function; don't invoke it here
  null,
  true,
  'Europe/Berlin',
);