

Image by Author | ChatGPT
The Hidden Cost of Routine SQL Reporting
Data teams across organizations face the same recurring problem: stakeholders require regular reports, but manual SQL reporting consumes valuable time that could be spent on analysis. The process is the same regardless of company size: connect to the database, execute queries, format results, and distribute findings to decision-makers.
Data professionals routinely handle reporting tasks that don't require advanced statistical knowledge or domain expertise, yet these tasks consume significant time through repetitive execution of the same queries and formatting procedures.
This workflow addresses a fundamental efficiency problem: turning a one-time setup into ongoing, automated delivery of professional reports directly to stakeholder inboxes.
The Solution: A 4-Node Automated Reporting Pipeline
Building on our earlier n8n exploration, this workflow tackles a different automation challenge: scheduled SQL reporting. While our first tutorial focused on data quality analysis, this one demonstrates how n8n handles database integration, recurring schedules, and email distribution.
Unlike writing standalone Python scripts for reporting, n8n workflows are visual, reusable, and easy to modify. You can connect databases, perform transformations, run analyses, and deliver results, all without switching between different tools or environments. Each workflow consists of "nodes" that represent different actions, linked together to create an automated pipeline.
Our automated SQL reporter consists of 4 linked nodes that turn manual reporting into a hands-off process:
- Schedule Trigger – Runs every Monday at 9 AM
- PostgreSQL Node – Executes the sales query against the database
- Code Node – Transforms raw data into a formatted HTML report
- Send Email Node – Delivers the professional report to stakeholders
Building the Workflow: Step-by-Step Implementation
Prerequisites
Step 1: Set Up Your PostgreSQL Database
We'll create a realistic sales database using Supabase for this tutorial. Supabase is a cloud-based PostgreSQL platform that provides managed databases with built-in APIs and authentication, making it ideal for rapid prototyping and production applications. While this tutorial uses Supabase for convenience, the n8n workflow connects to any PostgreSQL database, including AWS RDS, Google Cloud SQL, or your organization's existing database infrastructure.
Create a Supabase Account:
- Go to supabase.com and sign up for free
- Create a new project – choose any name and region
- Wait for setup – database provisioning takes about 2 minutes
- View your connection details from the Settings > Database page (or the "Connect" button on the main page)
Load Sample Data:
Navigate to the SQL Editor in Supabase and run this setup script to create our sales database tables and populate them with sample data:
-- Create employees table
CREATE TABLE employees (
    emp_id SERIAL PRIMARY KEY,
    first_name VARCHAR(50),
    last_name VARCHAR(50),
    department VARCHAR(50)
);
-- Create sales table
CREATE TABLE sales (
    sale_id SERIAL PRIMARY KEY,
    emp_id INTEGER REFERENCES employees(emp_id),
    sale_amount DECIMAL(10,2),
    sale_date DATE
);
-- Insert sample employees
INSERT INTO employees (first_name, last_name, department) VALUES
('Mike', 'Johnson', 'Sales'),
('John', 'Doe', 'Sales'),
('Tom', 'Wilson', 'Sales'),
('Sarah', 'Chen', 'Marketing');
-- Insert recent sales records
INSERT INTO sales (emp_id, sale_amount, sale_date) VALUES
(1, 2500.00, CURRENT_DATE - 2),
(1, 1550.00, CURRENT_DATE - 5),
(2, 890.00, CURRENT_DATE - 1),
(2, 1500.00, CURRENT_DATE - 4),
(3, 3200.00, CURRENT_DATE - 3),
(4, 1200.00, CURRENT_DATE - 6);
Paste this entire script into the SQL Editor and click the "Run" button in the bottom-right corner. You should see "Success. No rows returned", confirming that your tables and sample data were created successfully.
Test Your Connection:
Within the same SQL Editor, run a quick query to verify everything works: SELECT COUNT(*) FROM employees;
You should see 4 employees in the results.
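You can also confirm that the sales rows loaded with a similar count; the setup script above inserts six of them:
-- Check the sample sales data (the setup script above inserts 6 rows)
SELECT COUNT(*) FROM sales;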
Step 2: Configure Gmail for Automated Sending
Enable an App Password:
- Turn on 2-step verification in your Google Account settings
- Generate an app password – go to Security > App passwords
- Select "Mail" and "Other" – name it "n8n reporting"
- Copy the 16-character password – you'll need this for n8n
Step 3: Import and Configure the Workflow
Import the Template:
- Download the workflow file
- Open n8n and click "Import from File"
- Select the downloaded file – all 4 nodes appear automatically
- Save the workflow as "Automated SQL Reporting"
The imported workflow contains 4 linked nodes with all of the complex SQL and formatting code already configured.
Configure the Database Connection:
- Click the PostgreSQL node
- Get your connection details from Supabase by clicking the "Connect" button on your main page. For n8n integration, use the "Transaction pooler" connection string, since it's optimized for automated workflows:
- Create a new credential with your Supabase details:
- Host: (your-project).supabase.com
- Database: postgres
- User: postgres…..
- Password: (from Supabase settings)
- Port: 6543
- SSL: Enable
- Test the connection – you should see a green success message
Configure Email Settings:
- Click the Send Email node
- Create an SMTP credential:
- Host: smtp.gmail.com
- Port: 587
- User: your-email@gmail.com
- Password: (your app password)
- Secure: Enable STARTTLS
- Update the recipient in the "To Email" field
That's it! The analysis logic automatically adapts to different database schemas, table names, and data types.
Step 4: Test and Deploy
- Click "Execute Workflow" in the toolbar
- Watch each node turn green as it processes
- Check your email – you should receive the formatted report
- Toggle the workflow to "Active" to enable the Monday morning automation
Once setup is complete, you'll receive automated weekly reports without any manual intervention.
Understanding Your Automated Report
Here's what your stakeholders will receive every Monday:
Email Subject: 📊 Weekly Sales Report – June 27, 2025
Report Content:
- Clean HTML table with proper styling and borders
- Summary statistics calculated automatically from SQL results
- Professional formatting suitable for executive stakeholders
- Timestamp and metadata for audit trails
Here's what the final report looks like:
The workflow automatically handles all of the complex formatting and calculations behind this professional output. Notice how the report includes proper currency formatting, calculated averages, and clean table styling, all generated directly from raw SQL results without any manual intervention. The email arrives with a timestamp, making it easy for stakeholders to track reporting periods and maintain audit trails for decision-making processes.
Technical Deep Dive: Understanding the Implementation
Schedule Trigger Configuration:
The workflow runs every Monday at 9:00 AM using n8n's interval scheduling. This timing ensures reports arrive before weekly team meetings.
SQL Query Logic:
The PostgreSQL node executes a sophisticated query with JOINs, date filtering, aggregations, and proper numeric formatting (a sketch of such a query follows the list below). It automatically:
- Joins the employees and sales tables for complete records
- Filters records to the last 7 days using CURRENT_DATE - INTERVAL '7 days'
- Calculates total sales, revenue, and averages per person
- Orders results by revenue for business prioritization
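A query along these lines produces that result set. This is a minimal sketch against the sample schema above; the exact query and column aliases shipped in the imported workflow may differ:
-- Weekly per-employee sales summary (illustrative sketch)
SELECT
    e.first_name || ' ' || e.last_name AS employee,
    e.department,
    COUNT(s.sale_id) AS total_sales,
    SUM(s.sale_amount) AS total_revenue,
    ROUND(AVG(s.sale_amount), 2) AS avg_sale
FROM employees e
JOIN sales s ON s.emp_id = e.emp_id
WHERE s.sale_date >= CURRENT_DATE - INTERVAL '7 days'
GROUP BY e.emp_id, e.first_name, e.last_name, e.department
ORDER BY total_revenue DESC;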
HTML Generation Logic:
The Code node transforms the SQL results into professional HTML using JavaScript. It iterates through the query results, builds styled HTML tables with consistent formatting, calculates summary statistics, and adds professional touches like emojis and timestamps.
Email Delivery:
The Send Email node uses Gmail's SMTP service with proper authentication and HTML rendering support.
Testing with Different Scenarios
To see how the workflow handles different data patterns, try these modifications (combined in the sketch after this list):
- Different Time Periods: Change INTERVAL '7 days' to INTERVAL '30 days' for monthly reports
- Department Filtering: Add WHERE e.department = 'Sales' for team-specific reports
- Different Metrics: Modify the SELECT clause to include product categories or customer segments
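As an example of combining the first two changes, a monthly, Sales-only variant could look like the following. This is a sketch against the sample schema; adapt it to the query in your own workflow:
-- Monthly report filtered to the Sales department (illustrative sketch)
SELECT
    e.first_name || ' ' || e.last_name AS employee,
    COUNT(s.sale_id) AS total_sales,
    SUM(s.sale_amount) AS total_revenue
FROM employees e
JOIN sales s ON s.emp_id = e.emp_id
WHERE s.sale_date >= CURRENT_DATE - INTERVAL '30 days'
  AND e.department = 'Sales'
GROUP BY e.emp_id, e.first_name, e.last_name
ORDER BY total_revenue DESC;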
Based on your business needs, you can decide on next steps: weekly reports work well for operational teams, monthly reports suit strategic planning, quarterly reports serve executive dashboards, and daily reports help with real-time monitoring. The workflow adapts automatically to any SQL structure, allowing you to quickly create multiple reporting pipelines for different stakeholders.
Next Steps
1. Multi-Database Support
Replace the PostgreSQL node with MySQL, SQL Server, or any supported database. The workflow logic remains identical while connecting to different data sources. This flexibility makes the solution valuable across diverse technology stacks.
2. Advanced Scheduling
Modify the Schedule Trigger for different frequencies. Set up daily reports for operational metrics, monthly reports for strategic planning, or quarterly reports for board meetings. Each schedule can target different recipient groups with tailored content.
3. Enhanced Formatting
Extend the Code node to include charts and visualizations using Chart.js, conditional formatting based on performance thresholds, or executive summaries with key insights. The HTML output supports rich formatting and embedded graphics.
4. Multi-Recipient Distribution
Add logic to send different reports to different stakeholders. Sales managers receive individual team reports, executives receive high-level summaries, and finance teams receive revenue-focused metrics. This targeted approach ensures each audience gets relevant information.
Conclusion
This automated SQL reporting workflow demonstrates how n8n bridges the gap between data science expertise and operational efficiency. By combining database integration, scheduling, and email automation, you can eliminate routine reporting tasks while delivering professional results to stakeholders.
The workflow's modular design makes it particularly valuable for data teams managing multiple reporting requirements. You can duplicate the workflow for different databases, modify the SQL queries for different metrics, and adjust the formatting for different audiences, all without writing custom scripts or managing server infrastructure.
Unlike traditional ETL tools that require extensive configuration, n8n's visual interface makes complex data workflows accessible to both technical and non-technical team members. Your SQL expertise remains the core value, while n8n handles the automation infrastructure, scheduling reliability, and delivery mechanisms.
Most importantly, this approach scales with your team's needs. Start with simple weekly reports, then expand to include data visualizations, multi-database queries, or integration with business intelligence platforms. The foundation you build today becomes the automated reporting infrastructure that supports your team's growth tomorrow.
Born in India and raised in Japan, Vinod brings a global perspective to data science and machine learning education. He bridges the gap between emerging AI technologies and practical implementation for working professionals. Vinod focuses on creating accessible learning pathways for complex topics like agentic AI, performance optimization, and AI engineering. He specializes in hands-on machine learning implementations and mentoring the next generation of data professionals through live sessions and personalized guidance.