---
title: N8n
emoji: 💡🌀✨
colorFrom: green
colorTo: indigo
sdk: docker
pinned: false
---
# Project: Dockerized PostgreSQL with WebDAV Backup and Node.js Integration
This Docker image is built on top of **PostgreSQL 15** and provides an integrated environment for running PostgreSQL, performing automated database backups via WebDAV, and executing custom scripts written in **Node.js** and **Python**. This image is tailored for use with **n8n**, an open-source workflow automation tool, and it supports automatic backup management and webhook communication.
## Features
- **PostgreSQL Database**: Provides a fully functional PostgreSQL 15 database instance.
- **Automated Backups**: Includes automated backup functionality to a WebDAV server using custom scripts.
- **Node.js Environment**: Installs Node.js (default version 20) to allow running workflows or services that depend on Node.js.
- **Python Integration**: Python 3 is pre-installed, and Python packages can be managed through a virtual environment (`venv`).
- **Webhook Integration**: Allows interaction with external services via a webhook URL.
- **Configurable Environment**: Fully customizable through build-time arguments and environment variables.
- **WebDAV Support**: Automatic interaction with a WebDAV server for file backup and retrieval.
- **Custom Script Support**: Ships with custom shell scripts to manage database backups and data imports.
## Environment Variables
The Docker image uses the following environment variables for configuration. You can modify these variables at runtime to customize behavior:
| Variable | Default Value | Description |
|-----------------------|---------------------------------------|-------------|
| `POSTGRES_USER` | `n8n` | PostgreSQL user. |
| `POSTGRES_PASSWORD` | `n8n` | PostgreSQL password. |
| `POSTGRES_DB` | `n8n` | PostgreSQL database name. |
| `WEBHOOK_URL` | `https://aigenai-db.hf.space/` | Webhook URL for external communication. |
| `DB_IMPORT` | `yes` | If set to `yes`, imports the database on startup. |
| `NODEJS_VER` | `20` | Version of Node.js to install. |
| `WEBDAV_URL` | `https://cfr2.n8n.us.kg/` | URL of the WebDAV server for backups. |
| `WEBDAV_USER` | `your_username` | WebDAV username for authentication. |
| `WEBDAV_PASSWORD` | `your_password` | WebDAV password for authentication. |
| `N8N_PORT` | `7860` | Port on which n8n will be accessible. |
| `GENERIC_TIMEZONE` | `Asia/Shanghai` | Timezone for the Docker container. |
| `DB_TYPE` | `postgresdb` | Specifies the database type. |
| `DB_POSTGRESDB_HOST` | `localhost` | Hostname for the PostgreSQL database. |
| `DB_POSTGRESDB_PORT` | `5432` | Port for PostgreSQL. |
| `VIRTUAL_ENV` | `/app/venv` | Location of the Python virtual environment. |
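Rather than repeating `-e` flags, the runtime variables above can be collected in an env file and passed with Docker's `--env-file` option. The filename and values below are illustrative, not defaults baked into the image:

```bash
# Write runtime overrides to an env file (values are examples only)
cat > n8n.env <<'EOF'
POSTGRES_PASSWORD=change_me
WEBDAV_URL=https://my-webdav.example.com/
WEBDAV_USER=backup_user
WEBDAV_PASSWORD=backup_secret
GENERIC_TIMEZONE=UTC
EOF

# Pass the whole file at once instead of individual -e flags
docker run -d --env-file n8n.env -p 7860:7860 custom-postgres-n8n:latest
```

Keeping credentials in an env file (excluded from version control) also avoids leaking secrets into your shell history.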
## Build-Time Arguments
These build-time arguments can be provided during the image build to customize the resulting Docker image:
| Argument | Default Value | Description |
|--------------------|-----------------------------------------|-------------|
| `DUMP_URL` | `""` | URL for the initial database dump (optional). |
| `DUMP_PASSWORD` | `""` | Password for accessing the database dump (if necessary). |
| `POSTGRES_USER` | `n8n` | PostgreSQL user. |
| `POSTGRES_PASSWORD`| `n8n` | PostgreSQL password. |
| `POSTGRES_DB` | `n8n` | PostgreSQL database name. |
| `WEBHOOK_URL` | `https://aigenai-db.hf.space/` | Webhook URL for external services. |
| `NODEJS_VER` | `20` | Node.js version to install. |
| `WEBDAV_URL` | `https://cfr2.n8n.us.kg/` | WebDAV URL for backups. |
| `WEBDAV_USER` | `your_username` | WebDAV user. |
| `WEBDAV_PASSWORD` | `your_password` | WebDAV password. |
## Usage
### Build the Docker Image
To build the Docker image, use the following command, passing in any custom arguments as needed:
```bash
docker build --build-arg POSTGRES_USER=myuser \
--build-arg POSTGRES_PASSWORD=mypassword \
--build-arg WEBDAV_URL=https://mywebdavserver.com \
--build-arg WEBDAV_USER=mywebdavuser \
--build-arg WEBDAV_PASSWORD=mywebdavpassword \
-t custom-postgres-n8n:latest .
```
### Run the Docker Container
To run the container with customized environment variables:
```bash
docker run -d \
-e POSTGRES_USER=n8n \
-e POSTGRES_PASSWORD=n8n \
-e POSTGRES_DB=n8n \
-e WEBHOOK_URL=https://your-webhook.url/ \
-e WEBDAV_URL=https://your-webdav.url/ \
-e WEBDAV_USER=your_username \
-e WEBDAV_PASSWORD=your_password \
-p 7860:7860 \
custom-postgres-n8n:latest
```
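After starting the container, you can check that everything came up cleanly. The `/healthz` path is the health endpoint n8n versions typically expose; adjust if yours differs:

```bash
# Confirm the container is running
docker ps --filter ancestor=custom-postgres-n8n:latest

# n8n normally answers on its health endpoint once startup finishes
curl -fsS http://localhost:7860/healthz

# Follow the logs to watch the database import and n8n startup
docker logs -f <container_id>
```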
### Backup and Restore
The image includes scripts for backing up and restoring the PostgreSQL database to and from a WebDAV server:
- **Backup**: The `backup.sh` script uploads a PostgreSQL database dump to the WebDAV server.
- **Import**: The `import-db.sh` script retrieves the latest database dump from the WebDAV server and imports it into PostgreSQL.
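To illustrate the mechanism, a minimal backup step along these lines dumps the database and uploads it to WebDAV via HTTP `PUT`. This is a sketch of the approach, not the shipped `backup.sh`, whose details may differ:

```bash
#!/usr/bin/env bash
# Sketch of a WebDAV backup step (the actual backup.sh may differ)
set -euo pipefail

STAMP=$(date +%Y%m%d_%H%M%S)
DUMP_FILE="/tmp/n8n_${STAMP}.sql.gz"

# Dump the database and compress it
pg_dump -U "${POSTGRES_USER:-n8n}" "${POSTGRES_DB:-n8n}" | gzip > "$DUMP_FILE"

# WebDAV uploads are plain HTTP PUTs; curl -T does exactly that
curl -fsS -u "${WEBDAV_USER}:${WEBDAV_PASSWORD}" \
  -T "$DUMP_FILE" "${WEBDAV_URL%/}/$(basename "$DUMP_FILE")"
```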
To manually trigger a backup or import, connect to the running container:
```bash
docker exec -it <container_id> bash
```
Then run:
```bash
# Backup the database
./backup.sh
# Import the database
./import-db.sh
```
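If you want backups on your own schedule, the script can also be triggered from the host with cron. The container name, schedule, and script path inside the container are assumptions to adapt:

```bash
# Host crontab entry: run backup.sh inside the container every day at 03:00
0 3 * * * docker exec my-n8n-container bash -c './backup.sh' >> /var/log/n8n-backup.log 2>&1
```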
### Webhook Integration
The `WEBHOOK_URL` variable tells n8n the public base URL to use when registering webhook endpoints, so that external services can trigger workflows or be notified about database changes.
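As a sketch, an external system could trigger a workflow by POSTing to a webhook path under `WEBHOOK_URL`. The path segment `my-workflow` is a placeholder you would define on a Webhook node in n8n:

```bash
# POST a JSON payload to a webhook registered in n8n (path is illustrative)
curl -X POST "https://your-webhook.url/webhook/my-workflow" \
  -H "Content-Type: application/json" \
  -d '{"event": "backup_finished", "status": "ok"}'
```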
### Python and Node.js Environment
This image includes both **Python** and **Node.js**. Python dependencies are listed in a `requirements.txt` file and installed into the virtual environment; Node.js dependencies are listed in a `package.txt` file. You can install additional packages and run custom scripts with either runtime.
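For example, extra dependencies could be installed from inside the running container as follows. The package names are examples, and `/app/venv` matches the `VIRTUAL_ENV` default listed above:

```bash
# Install a Python package into the image's virtual environment
/app/venv/bin/pip install requests

# Or install everything pinned in requirements.txt
/app/venv/bin/pip install -r requirements.txt

# Install a Node.js package for use in workflows
npm install axios
```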
### Exposed Port
- **n8n**: Accessible on port `7860` by default.
## Custom Scripts
Three key scripts are provided:
- `run.sh`: Main entry point script.
- `import-db.sh`: Handles database imports from WebDAV.
- `backup.sh`: Manages automated backups to WebDAV.
## License
This project is licensed under the MIT License. See the LICENSE file for details.
---
This setup is ideal for environments where database management, automation, and cloud backups are required. It is especially suited for **n8n** users who want a robust and flexible system for managing their PostgreSQL database in conjunction with other automation workflows.