phpMyAdmin in Node.js: how to build a basic REST API

phpMyAdmin popularized the idea of managing a MySQL database through a web-based administration interface. Taking it as a model, in this article we will build a Node.js API that lets us perform essential operations on a MySQL instance.

Requirements

  • A MySQL installation.
  • A MongoDB installation.
  • Node.js and NPM.

Required NPM Modules

  1. express
  2. body-parser
  3. cors
  4. dotenv
  5. mongoose
  6. mysql2
  7. validator
  8. jsonwebtoken

The .env File

This configuration file will be used by the dotenv module and will provide some essential parameters for our application to function.

# .env

JWTSECRET=secret
JWTEXPIRESIN=604800
DBURI=mongodb://localhost:27017/mysql_client
HASHKEY=KvoF9FIYlg89DeiNbE4JOOlJl6GcQVTJ
HASHIV=8niXOZKtrJnSnXNC

  1. JWTSECRET: The private string used to generate a JSON Web Token that will allow access to protected routes.
  2. JWTEXPIRESIN: The number of seconds that will set the expiration for the authentication token.
  3. DBURI: The connection string to MongoDB.
  4. HASHKEY, HASHIV: Private strings for encrypting and decrypting the MySQL database connection passwords.
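
One possible way to generate suitable values for HASHKEY and HASHIV is a small one-off script; this is only a sketch, and any source of 32 and 16 random characters respectively will do:

// Sketch: generating values for HASHKEY and HASHIV.
// aes-256-cbc needs a 32-byte key and a 16-byte IV; hex-encoding random bytes
// yields strings of exactly that length when later read as UTF-8.
const crypto = require('crypto');

console.log('HASHKEY=' + crypto.randomBytes(16).toString('hex')); // 32 characters
console.log('HASHIV=' + crypto.randomBytes(8).toString('hex'));   // 16 characters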

MongoDB Collections

In MongoDB, we will define the two essential collections for our APIs.

  1. users
  2. connections

With Mongoose, we will define the model for the users collection as follows:

// models/user.js

const { Schema, model } = require('mongoose');

const userSchema = new Schema({
    name: String,
    email: String,
    password: String,
    role: {
        type: String,
        default: 'user',
    },
    connections: [
        {
            type: Schema.Types.ObjectId,
            ref: 'Connection',
        },
    ]
});
module.exports = model('User', userSchema, 'users');

The model for the connections collection will instead be as follows:

// models/connection.js

const { Schema, model } = require('mongoose');

const connectionSchema = new Schema({
    host: String,
    port: String,
    username: String,
    password: String,
    database: String,
    user: {
        type: Schema.Types.ObjectId,
        ref: 'User',
    }
});

module.exports = model('Connection', connectionSchema, 'connections');

A user has a one-to-many relationship with MySQL connections, each represented by an ObjectId in the connections array of the User model. Mongoose's populate() method lets us access the corresponding Connection documents whenever we need them.

Remember that Mongoose automatically casts a string representing an ObjectId into the BSON ObjectId type that MongoDB requires.
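
As a minimal sketch (someUserId is a placeholder for a real ObjectId string), loading a user together with its Connection documents looks like this:

// Sketch: resolving the ObjectIds in the connections array into full documents.
const User = require('./models/user');

const loadUserWithConnections = async (someUserId) => {
    // populate() replaces each ObjectId with the referenced Connection document
    return User.findById(someUserId).populate('connections').exec();
};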

Utilities

Authentication

For the authentication functions, we need a middleware that validates the JSON Web Token passed with the request, and a second function that generates the token from the data of the user who logged in.

// utils/auth.js

const jwt = require('jsonwebtoken');

const authMiddleware = (req, res, next) => {
    const header = req.headers.authorization;
    const token = header && header.split(' ')[1];
    if (!token) {
        return res.status(401).json({ message: 'Token is required' });
    }
    jwt.verify(token, process.env.JWTSECRET, (err, decoded) => {
        if (err) {
            return res.status(401).json({ message: 'Invalid token' });
        }
        req.user = decoded;
        next();
    });
};

const generateToken = (data) => {
    const expires = parseInt(process.env.JWTEXPIRESIN, 10);
    const plainData = { name: data.name, email: data.email };
    try {
        return jwt.sign(plainData, process.env.JWTSECRET, { expiresIn: expires });
    } catch (error) {
        return '';
    }
};

module.exports = { authMiddleware, generateToken };

To successfully generate a token, the payload passed to jwt.sign() must be a plain JavaScript object containing only the properties we actually need, rather than a full Mongoose document. In this case, the plainData object contains only the user's name and email.
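
On the client side, the token returned by the login endpoint is then sent back with every protected request using the Bearer scheme that authMiddleware expects. A minimal sketch, assuming the server from the final section runs locally on port 3000 and using the fetch API available in recent Node.js versions:

// Sketch: calling a protected route with the token obtained at login (URL and userId are placeholders).
const callProtectedRoute = async (token, userId) => {
    const response = await fetch(`http://localhost:3000/api/users/update/${userId}`, {
        method: 'PUT',
        headers: {
            'Content-Type': 'application/json',
            Authorization: `Bearer ${token}`, // parsed by authMiddleware as "Bearer <token>"
        },
        body: JSON.stringify({ name: 'New name' }),
    });
    return response.json();
};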

Encryption and Hashing

To encrypt the MySQL database connection password and decrypt it when needed, we can use the core crypto module and the aes-256-cbc algorithm. User password hashing, on the other hand, will use the SHA-256 algorithm.

// utils/hash.js

const crypto = require('crypto');
const algorithm = 'aes-256-cbc';
const key = Buffer.from(process.env.HASHKEY);
const iv = Buffer.from(process.env.HASHIV);

const encrypt = (text) => {
    const cipher = crypto.createCipheriv(algorithm, key, iv);
    let encrypted = cipher.update(text, 'utf8', 'hex');
    encrypted += cipher.final('hex');
    return encrypted;
};

const decrypt = (encryptedText) => {
    const decipher = crypto.createDecipheriv(algorithm, key, iv);
    let decrypted = decipher.update(encryptedText, 'hex', 'utf8');
    decrypted += decipher.final('utf8');
    return decrypted;
};

const hash = (password) => {
    return crypto.createHash('sha256').update(password).digest('hex');
};

module.exports = { encrypt, decrypt, hash };
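
A minimal round-trip sketch of these helpers, assuming the .env values shown earlier have been loaded:

// Sketch: encrypting/decrypting a connection password and hashing a user password.
require('dotenv').config();
const { encrypt, decrypt, hash } = require('./utils/hash');

const encrypted = encrypt('my-mysql-password'); // hex string safe to store in MongoDB
console.log(decrypt(encrypted));                // "my-mysql-password"
console.log(hash('s3cret'));                    // 64-character SHA-256 hex digest, never decrypted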

MySQL

The mysql2 module allows for connection pool management, which is helpful when we need efficient connections to multiple databases, as in our case, where a user can save multiple connections.

// utils/mysql.js

const mysql = require('mysql2/promise');
const User = require('../models/user');
const { decrypt } = require('./hash');

const getUserConnectionDetails = async (userId, databaseName) => {
    try {
        const user = await User.findById(userId).populate('connections').exec();
        const connections = user.connections.filter((connection) => connection.database === databaseName);
        return connections.length > 0 ? connections[0] : null;
    } catch (error) {
        return null;
    }
};

const createMySQLConnectionPool = async (userId, databaseName) => {
    const connectionDetails = await getUserConnectionDetails(userId, databaseName);
    if (!connectionDetails) {
        return null;
    }

    const { host, port, username, password, database } = connectionDetails;
    const decryptedPassword = decrypt(password);
    return mysql.createPool({
        host,
        port,
        user: username,
        password: decryptedPassword,
        database,
    });
};

const executeQueryForUser = async (userId, databaseName, query) => {
    const pool = await createMySQLConnectionPool(userId, databaseName);
    if (!pool) {
        return null;
    }

    try {
        const [rows] = await pool.query(query);
        return rows;
    } catch (error) {
        // Surface a failed query to the caller as null, just like a missing connection
        return null;
    } finally {
        await pool.end();
    }
};

module.exports = { executeQueryForUser };

As we can see, we first need to retrieve the connection that matches the database chosen by the user. The connection pool is created after decrypting the password. The exported function, executeQueryForUser(userId, databaseName, query), executes a query on the selected database and returns the rows found as a result.
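
As a usage sketch (the user id and database name below are placeholders), a caller can run an arbitrary query like this:

// Sketch: running a query through the helper; the arguments are placeholders.
const { executeQueryForUser } = require('./utils/mysql');

const showVersion = async (userId) => {
    const rows = await executeQueryForUser(userId, 'blog', 'SELECT VERSION() AS version');
    if (!rows) {
        console.log('No matching connection, or the query failed');
        return;
    }
    console.log(rows[0].version);
};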

The API

// routes/index.js

const express = require('express');
const router = express.Router();
const userController = require('../controllers/users');
const connectionController = require('../controllers/connections');
const auth = require('../utils/auth');

router.post('/users/create', userController.createUser);
router.post('/users/login', userController.loginUser);
router.put('/users/update/:id', auth.authMiddleware, userController.updateUser);
router.delete('/users/delete/:id', auth.authMiddleware, userController.deleteUser);

router.get('/connections/:id/tables', auth.authMiddleware, connectionController.listTablesInUserDatabase);
router.get('/connections/:id/data', auth.authMiddleware, connectionController.listColumnsAndDataInTable);
router.post('/connections/create', auth.authMiddleware, connectionController.createConnection);

module.exports = router;

User CRUD

The CRUD (Create, Read, Update, Delete) operations on users are standard, so we will not dwell on them in this article and list them only for completeness.

// controllers/users.js

const { hash } = require('../utils/hash');
const auth = require('../utils/auth');
const User = require('../models/user');

const createUser = async (req, res, next) => {
    try {
        const { name, email, password } = req.body;
        const hashedPassword = hash(password);
        const user = new User({
            name,
            email,
            password: hashedPassword
        });
        await user.save();
        return res.status(201).json(user);
    } catch (error) {
        return res.status(500).json({ error });
    }
};

const loginUser = async (req, res, next) => {
    try {
        const { email, password } = req.body;
        const hashedPassword = hash(password);
        const user = await User.findOne({ email, password: hashedPassword });
        if (!user) {
            return res.status(200).json({ error: 'Invalid credentials' });
        }
        const token = auth.generateToken(user);
        if (!token) {
            return res.status(200).json({ error: 'Token generation failed' });
        }
        return res.status(200).json({ token, email: user.email, name: user.name });
    } catch (error) {
        return res.status(500).json({ error });
    }
};

const updateUser = async (req, res, next) => {
    try {
        const id = req.params.id;
        const body = req.body;
        let update = {};
        if (body.name) {
            update.name = body.name;
        }
        if (body.email) {
            update.email = body.email;
        }
        if (body.password) {
            update.password = hash(body.password);
        }
        const user = await User.findByIdAndUpdate(id, update, { new: true });
        if (!user) {
            return res.status(200).json({ error: 'User not found' });
        }
        return res.status(200).json(user);
    } catch (error) {
        return res.status(500).json({ error });
    }
};

const deleteUser = async (req, res, next) => {
    try {
        const id = req.params.id;
        const user = await User.findByIdAndDelete(id);
        if (!user) {
            return res.status(200).json({ error: 'User not found' });
        }
        return res.status(200).json(user);
    } catch (error) {
        return res.status(500).json({ error });
    }
};


module.exports = {
    createUser,
    loginUser,
    updateUser,
    deleteUser
};

Operations on MySQL

The first step in operating on MySQL is to create a new connection and associate it with the user who entered its parameters.

// controllers/connections.js

const Connection = require('../models/connection');
const validator = require('validator');
const { encrypt } = require('../utils/hash');
const { executeQueryForUser } = require('../utils/mysql');
const User = require('../models/user');

const createConnection = async (req, res, next) => {
    const { host, port, username, password, database } = req.body;
    if (!host || !port || !username || !password || !database) {
        return res.status(200).json({ error: 'All fields are required' });
    }
    if(host !== 'localhost') {
        if(!validator.isIP(host)) {
            return res.status(200).json({ error: 'Invalid host' });
        }
    }
    if(!validator.isPort(port)) {
        return res.status(200).json({ error: 'Invalid port' });
    }
    const encryptedPassword = encrypt(password);
    const connection = new Connection({
        host,
        port,
        username,
        password: encryptedPassword,
        database,
        user: req.body.id
    });
    try {
        const conn = await connection.save();
        const user = await User.findById(req.body.id);
        user.connections.push(conn._id);
        await user.save();
        return res.status(201).json({ message: 'Connection created' });
    } catch (err) {
        return res.status(500).json({ error: err });
    }
};

The input data is validated by the validator module. If the host parameter is different from localhost, a valid IP address is required. Similarly, the port parameter must be a valid port number, i.e., a numeric value within the standard range (0-65535).

Here we notice a limitation in our code: hostnames other than localhost are not allowed, which could be a problem if we want to connect to a remote server that has to be reached by hostname rather than by IP address.
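
One way to relax this check, as a sketch, would be to also accept fully qualified domain names through the isFQDN() validator offered by the same module:

// Sketch: accepting hostnames in addition to IP addresses.
if (host !== 'localhost' && !validator.isIP(host) && !validator.isFQDN(host)) {
    return res.status(200).json({ error: 'Invalid host' });
}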

Let’s take a moment to look at the connection creation and linkage to the users collection:

const conn = await connection.save();
const user = await User.findById(req.body.id);
user.connections.push(conn._id);
await user.save();

Mongoose's documentation emphasizes the use of the save() method for both creating and updating a document. Here, conn gives us access to the ObjectId of the newly created Connection document. Likewise, instead of using an update operator such as $push to insert the ObjectId into the connections array, we can modify the user instance with plain JavaScript and then call save() to persist the change.
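
For comparison, the same link could also be created atomically with an update operator; a sketch of the $push alternative:

// Sketch: pushing the new ObjectId with an atomic update instead of mutating the document.
await User.findByIdAndUpdate(req.body.id, { $push: { connections: conn._id } });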

Now, we define a method to list the tables present in the user-selected database.

// controllers/connections.js

const listTablesInUserDatabase = async (req, res, next) => {
    const { database } = req.query;
    if (!database) {
        return res.status(200).json({ error: 'Database name is required' });
    }
    const query = 'SHOW TABLES';
    const tables = await executeQueryForUser(req.params.id, database, query);
    if (!tables) {
        return res.status(500).json({ error: 'Failed to fetch tables' });
    }
    const tableNames = tables.map((table) => table[`Tables_in_${database}`]);
    return res.status(200).json({ tables: tableNames });
};

The MySQL query we are executing is SHOW TABLES, which with mysql2 returns an array of objects, each exposing a single property whose name is the same for every row (Tables_in_${database}). We apply map() to transform it into a flat array of table names.
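
For example, for a hypothetical database named blog, the raw result and the mapped array would look like this:

// Raw rows returned by SHOW TABLES for a hypothetical database named "blog"
const tables = [
 { "Tables_in_blog": "posts" },
 { "Tables_in_blog": "comments" },
];

// After map(): ["posts", "comments"]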

As a final step, we can list the columns and data in a table chosen by the user.

// controllers/connections.js

const listColumnsAndDataInTable = async (req, res, next) => {
    const { database, table } = req.query;
    if (!database || !table) {
        return res.status(200).json({ error: 'Database and table names are required' });
    }
    const page = req.query.page ? parseInt(req.query.page, 10) : 1;
    const perPage = 10;
    const offset = (page - 1) * perPage;
    const queryColumns = `SHOW COLUMNS FROM ${table}`;
    const columns = await executeQueryForUser(req.params.id, database, queryColumns);
    const query = `SELECT * FROM ${table} LIMIT ${offset}, ${perPage}`;
    const data = await executeQueryForUser(req.params.id, database, query);
    if (!columns || !data) {
        return res.status(500).json({ error: 'Failed to fetch data' });
    }
    return res.status(200).json({ columns, data });
};

module.exports = { createConnection, listTablesInUserDatabase, listColumnsAndDataInTable };

The query SHOW COLUMNS FROM ${table} shows essential metadata that we can display to the user. For example:

const columns = [
 {
  "Field": "id",
  "Type": "int",
  "Null": "NO",
  "Key": "PRI",
  "Default": null,
  "Extra": "auto_increment"
 },
 {
  "Field": "title",
  "Type": "varchar(255)",
  "Null": "NO",
  "Key": "",
  "Default": null,
  "Extra": ""
 },
 //...
];

As we can see, we have all the data needed to create a view in the frontend that shows the selected table.
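
For instance, a minimal and purely illustrative sketch that turns the columns and data returned by this endpoint into an HTML table could look like this:

// Sketch: rendering the endpoint response as an HTML table (illustrative only).
const renderTable = ({ columns, data }) => {
    const header = columns.map((col) => `<th>${col.Field}</th>`).join('');
    const rows = data
        .map((row) => `<tr>${columns.map((col) => `<td>${row[col.Field] ?? ''}</td>`).join('')}</tr>`)
        .join('');
    return `<table><tr>${header}</tr>${rows}</table>`;
};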

The Application

// app.js

require('dotenv').config();

const express = require('express');
const bodyParser = require('body-parser');
const cors = require('cors');
const mongoose = require('mongoose');
const PORT = process.env.PORT || 3000;
const routes = require('./routes');

mongoose.connect(process.env.DBURI).catch((err) => console.error('MongoDB connection error:', err));

const app = express();

app.use(cors());
app.use(bodyParser.json());

app.use('/api', routes);

app.listen(PORT, () => console.log(`Server listening on port ${PORT}`));

The port on which the application listens can be set simply by adding a PORT variable to the .env file.
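
For example, one extra line in the file shown earlier is enough (8080 is an arbitrary value):

# .env

PORT=8080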

Conclusion

We have implemented a foundation that we can extend with more endpoints for managing our MySQL databases. It is also a great way to study aspects of MySQL that we usually think of as available only from the command-line client.
