Dynamically Renaming Columns in PySpark Using Regex: A Guide for Developers and Data Engineers

Dhruv Singhal
3 min read · May 22

Renaming columns in a PySpark DataFrame is a common operation in data processing tasks. While renaming columns with fixed names is straightforward, there are scenarios where you need to rename columns dynamically, based on conditions or patterns expressed as regular expressions (regex). This article walks through dynamically renaming columns with PySpark and regex, using a practical example.

Setup and Creating a DataFrame

Before we dive into dynamically renaming columns using regex, let’s start by setting up a PySpark environment and creating a sample DataFrame:

# Importing the necessary library
from pyspark.sql import SparkSession

# Creating a SparkSession
spark = SparkSession.builder.getOrCreate()

# Sample DataFrame
data = [("Alice", 25, "New York"), ("Bob", 30, "London"), ("Charlie", 35, "Paris")]
df = spark.createDataFrame(data, ["Name", "Age", "City"])

This code sets up a SparkSession and creates a DataFrame called df with the columns Name, Age, and City, populated with sample data for demonstration purposes.

Dynamically Renaming Columns using Regex

To dynamically rename columns using regex in a PySpark DataFrame, we can match each existing column name against a compiled pattern, build the full list of new names, and apply it in a single pass with the toDF() method. Here's the general approach:

# Importing the regex library
import re

# Mapping of regex patterns (old column names) to new column names
column_mapping = {"old_column_pattern": "new_column_name"}

# Renaming columns dynamically using regex
for old_col_pattern, new_col in column_mapping.items():
    regex = re.compile(old_col_pattern)
    renamed_cols = [new_col if regex.match(col_name) else col_name for col_name in df.columns]
    df = df.toDF(*renamed_cols)

In this approach, we define a dictionary called column_mapping that maps regex patterns for old column names to new column names. For each entry, we compile the pattern and build a new list of column names with a list comprehension, substituting the new name wherever the pattern matches. Finally, toDF() applies the whole renamed list to the DataFrame at once. Note that re.match() only anchors at the start of the name, so end your pattern with $ (or use re.fullmatch()) when you want an exact match rather than a prefix match.
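Because the renaming logic operates on df.columns, which is a plain Python list, it can be sketched and tested without a Spark session at all. Here is a minimal sketch of that logic as a standalone function (the rename_columns helper and the sample patterns are illustrative additions, not part of the original code):

```python
import re

def rename_columns(columns, column_mapping):
    """Apply regex-based renames to a list of column names.

    column_mapping maps regex patterns to replacement names; a column
    is renamed when the pattern matches at the start of its name,
    mirroring the re.match() semantics used in the DataFrame loop.
    """
    renamed = list(columns)
    for old_col_pattern, new_col in column_mapping.items():
        regex = re.compile(old_col_pattern)
        renamed = [new_col if regex.match(c) else c for c in renamed]
    return renamed

# Rename "Name" exactly, and any column starting with "Cit"
print(rename_columns(["Name", "Age", "City"],
                     {"^Name$": "full_name", "^Cit": "city"}))
# → ['full_name', 'Age', 'city']
```

Once the mapping behaves as expected on a plain list, applying it to the DataFrame is a one-liner: df = df.toDF(*rename_columns(df.columns, column_mapping)).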

Dhruv Singhal

Data engineer with expertise in PySpark, SQL, Flask. Skilled in Databricks, Snowflake, and Datafactory. Published articles. Passionate about tech and games.