In many scenarios, it's necessary to export database data into Excel files for analysis or reporting. Recently, I had a requirement to export all tables from a SQLite database into a single Excel file, with each table in its own worksheet. This article explains how to achieve this using Python's built-in sqlite3 module together with a free Excel processing library.
1. Environment Setup
1. Python Environment
It is recommended to use Python 3.6 or higher.
2. Install Required Libraries
- sqlite3: Part of Python's standard library; no separate installation needed.
- Free Spire.XLS: A free library for creating, writing, and formatting Excel files.
Installation command:
pip install Spire.Xls.Free
2. Implementation Approach
The export process can be broken down into the following 5 steps:
- Connect to the SQLite database
- Retrieve names of all user tables
- Create a new Excel workbook
- Iterate through each table:
  - Read the structure (column names) and data
  - Create a new worksheet (named after the table)
  - Write headers and data
  - Automatically adjust row heights and column widths
- Save the Excel file and close the database connection
3. Complete Code
```python
from spire.xls import *
from spire.xls.common import *
import sqlite3

# ---------------------- 1. Connect to the database ----------------------
# Replace with your database file path
conn = sqlite3.connect("Sales Data.db")
cursor = conn.cursor()

# ---------------------- 2. Get all table names ----------------------
# Skip SQLite's internal tables (names starting with 'sqlite_')
cursor.execute(
    "SELECT name FROM sqlite_master "
    "WHERE type='table' AND name NOT LIKE 'sqlite_%';"
)
table_names = [name[0] for name in cursor.fetchall()]

# ---------------------- 3. Create Excel workbook ----------------------
workbook = Workbook()
workbook.Worksheets.Clear()  # Clear the default generated worksheets

# ---------------------- 4. Write data table by table ----------------------
for table_name in table_names:
    # 4.1 Get column names (headers)
    cursor.execute(f"PRAGMA table_info('{table_name}')")
    columns_info = cursor.fetchall()
    column_names = [info[1] for info in columns_info]

    # 4.2 Get table data (quote the name in case it contains spaces)
    cursor.execute(f'SELECT * FROM "{table_name}"')
    rows = cursor.fetchall()

    # 4.3 Add new worksheet (named after the table)
    sheet = workbook.Worksheets.Add(table_name)

    # 4.4 Write headers
    for i, col_name in enumerate(column_names):
        sheet.Range[1, i + 1].Value = col_name

    # 4.5 Write data rows
    for row_idx, row_data in enumerate(rows):
        for col_idx, cell_value in enumerate(row_data):
            sheet.Range[row_idx + 2, col_idx + 1].Value = cell_value

    # 4.6 Auto-adjust row height and column width
    sheet.AllocatedRange.AutoFitRows()
    sheet.AllocatedRange.AutoFitColumns()

# ---------------------- 5. Save and clean up resources ----------------------
workbook.SaveToFile("DataBaseToExcel.xlsx", FileFormat.Version2016)
workbook.Dispose()
conn.close()
print("Data export completed!")
```
The example uses SQLite. For exporting from MySQL, PostgreSQL, or other databases, simply replace the database connection part; the rest of the logic remains the same.
4. Key Points Explained
1. Retrieving Table Names
The sqlite_master system table stores metadata about all tables, indexes, views, and triggers. Filtering by type='table' limits the results to tables, but SQLite's internal bookkeeping tables (such as sqlite_sequence) also appear there with that type; since their names always start with sqlite_, adding name NOT LIKE 'sqlite_%' keeps only user-defined tables.
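A quick way to see this filter in action is an in-memory database (the table names here are hypothetical examples):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# AUTOINCREMENT causes SQLite to create the internal sqlite_sequence table
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

# The name filter hides sqlite_sequence, leaving only user tables
cursor = conn.execute(
    "SELECT name FROM sqlite_master "
    "WHERE type='table' AND name NOT LIKE 'sqlite_%';"
)
print([row[0] for row in cursor.fetchall()])  # ['customers', 'orders']
conn.close()
```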
2. Retrieving Column Names
The PRAGMA table_info command provides detailed information about a table's columns. The second element in each result corresponds to the column name, which is ideal for use as an Excel header.
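A minimal, self-contained check of this (the table and columns are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT, price REAL)")

# Each result row has the form: (cid, name, type, notnull, dflt_value, pk),
# so index 1 is the column name
cursor = conn.execute("PRAGMA table_info('products')")
column_names = [info[1] for info in cursor.fetchall()]
print(column_names)  # ['id', 'name', 'price']
conn.close()
```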
3. Row and Column Indexing
The sheet.Range[Row, Column] indexing starts at 1, not 0. Therefore:
- Headers are written to row 1
- Data starts at row 2, corresponding to row_idx + 2
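The offset arithmetic can be checked in isolation with a plain dictionary standing in for worksheet cells (the sample rows are hypothetical):

```python
rows = [("Alice", 30), ("Bob", 25)]  # pretend query results

# Map zero-based Python indices to one-based worksheet coordinates
cells = {}
for row_idx, row_data in enumerate(rows):
    for col_idx, cell_value in enumerate(row_data):
        # Row 1 holds the headers, so data lands on row_idx + 2
        cells[(row_idx + 2, col_idx + 1)] = cell_value

print(cells[(2, 1)])  # 'Alice' -- first data row, first column
print(cells[(3, 2)])  # 25 -- second data row, second column
```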
4. Automatic Formatting
AllocatedRange automatically identifies the area filled with data, eliminating the need to manually calculate boundaries.
AutoFitRows() and AutoFitColumns() adjust row heights and column widths based on content, making the output more visually appealing.
5. Summary
This article presents a simple and practical solution for exporting database data using Python. With fewer than 50 lines of code, you can export every table in a database into a single Excel file, one worksheet per table. It requires no complex frameworks or heavy tools, making it ideal for quick data exports, report generation, and daily data processing tasks.