PySpark | How to Remove Non-ASCII Characters from a DataFrame?

When working with text data in Spark, you might come across special characters that don’t belong to the standard English alphabet. These characters are called non-ASCII characters. For example, accented letters like é in “José” or symbols like emojis 😊. Sometimes, you may need to clean your data by removing these characters. This article will show you how to identify and remove non-ASCII characters from a Spark DataFrame.

What Are ASCII and Non-ASCII Characters?

ASCII Characters: These include standard English letters, numbers, and common symbols like @, #, or !. Their Unicode values range from 0 to 127.
Examples: Hello, 123, @Spark.
Non-ASCII Characters: These include characters outside this range, such as accented letters, non-English text, or emojis.
Examples: José, 你好, 😊.
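You can check this in plain Python: a string is pure ASCII exactly when every character's code point is at most 127. A quick illustration (plain Python, separate from the Spark code below):

```python
# A character is ASCII if its code point (ord) is at most 127.
def is_ascii(text):
    return all(ord(ch) <= 127 for ch in text)

print(is_ascii("Hello"))   # True  - all code points below 128
print(is_ascii("José"))    # False - é is code point 233
print("José".isascii())    # Python 3.7+ has a built-in shortcut
```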

Why Remove Non-ASCII Characters?

You might want to remove non-ASCII characters for these reasons:

  1. Data Cleaning: Non-ASCII characters can make your data messy or incompatible with some systems.
  2. Standardization: Some tools or systems might only accept ASCII characters.
  3. Simplification: Removing these characters can make text easier to process or analyze.

Steps to Remove Non-ASCII Characters:

Let’s go through the steps to clean your Spark DataFrame by removing non-ASCII characters.
Step 1: Create a Sample DataFrame: Here’s a simple DataFrame that contains both ASCII and non-ASCII characters:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, regexp_replace

# Start Spark session
spark = SparkSession.builder.appName("Remove Non-ASCII Characters").getOrCreate()

# Sample data
data = [("José", 25), ("René", 30), 
("Hello", 35), ("你好", 40), 
("😊", 45)]
columns = ["Name", "Age"]
df = spark.createDataFrame(data, columns)

# Show the original DataFrame
df.show()

Output:

+-----+---+
| Name|Age|
+-----+---+
| José| 25|
| René| 30|
|Hello| 35|
|   你好| 40|
|    😊| 45|
+-----+---+

Step 2: Remove Non-ASCII Characters: You can use PySpark’s regexp_replace() function to find and remove all non-ASCII characters.

# Remove non-ASCII characters from the 'Name' column
cleaned_df = df.withColumn("Name", 
             regexp_replace(col("Name"), r'[^\x00-\x7F]', ""))

# Show the cleaned DataFrame
cleaned_df.show()

Output:

+-----+---+
| Name|Age|
+-----+---+
|  Jos| 25|
|  Ren| 30|
|Hello| 35|
|     | 40|
|     | 45|
+-----+---+

Explanation: The regular expression [^\x00-\x7F] matches every character outside the ASCII range (code points 0 to 127), and regexp_replace() replaces each match with an empty string (""). Note that accented letters are removed entirely, not transliterated: “José” becomes “Jos”, not “Jose”, because é is a single non-ASCII character. Rows that contained only non-ASCII characters, such as 你好 and 😊, become empty strings.
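To see the character class in action outside of Spark, here is the equivalent substitution with Python’s own re module (a small illustration; Spark’s regexp_replace uses Java’s regex engine, but it behaves the same for this pattern):

```python
import re

# [^\x00-\x7F] matches any character whose code point is above 127,
# i.e. anything outside the ASCII range.
pattern = re.compile(r"[^\x00-\x7F]")

print(pattern.sub("", "José"))   # "Jos"  - é is stripped entirely
print(pattern.sub("", "你好"))    # ""     - both characters removed
print(pattern.sub("?", "René"))  # "Ren?" - replaced with a placeholder
```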

Step 3: Handle Empty Strings: Sometimes, after removing non-ASCII characters, some rows may become empty. You can filter out these rows to clean your data further.

# Remove rows where the 'Name' column is empty
cleaned_non_empty_df = cleaned_df.filter(col("Name") != "")

# Show the cleaned DataFrame
cleaned_non_empty_df.show()

Output:

+-----+---+
| Name|Age|
+-----+---+
|  Jos| 25|
|  Ren| 30|
|Hello| 35|
+-----+---+

Step 4: Replace Non-ASCII Characters Instead of Removing: If you don’t want to completely remove non-ASCII characters, you can replace them with a placeholder like ?.

# Replace non-ASCII characters with a placeholder
placeholder_df = df.withColumn("Name", 
                 regexp_replace(col("Name"), r'[^\x00-\x7F]', "?"))

# Show the updated DataFrame
placeholder_df.show()

Output:

+-----+---+
| Name|Age|
+-----+---+
| Jos?| 25|
| Ren?| 30|
|Hello| 35|
|   ??| 40|
|    ?| 45|
+-----+---+
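If you would rather transliterate accented letters, so that José becomes Jose instead of Jos, one common approach (not shown in the steps above) is to decompose the text with Unicode normalization and then drop the combining marks. A minimal sketch using Python’s standard unicodedata module; wrapping it in a UDF, as in the commented lines, is one way to apply it to the DataFrame:

```python
import unicodedata

def to_ascii(text):
    # Decompose accented letters (é -> e + combining accent),
    # then encode to ASCII, silently dropping what doesn't fit.
    if text is None:
        return None
    decomposed = unicodedata.normalize("NFKD", text)
    return decomposed.encode("ascii", "ignore").decode("ascii")

print(to_ascii("José"))  # "Jose" - the accent is dropped, the letter kept
print(to_ascii("你好"))   # ""     - no ASCII equivalent exists

# To apply it to the 'Name' column (assuming the df from Step 1):
# from pyspark.sql.functions import col, udf
# from pyspark.sql.types import StringType
# df.withColumn("Name", udf(to_ascii, StringType())(col("Name"))).show()
```

Note that a Python UDF is slower than the built-in regexp_replace(), so use it only when transliteration is actually needed.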

Conclusion: Cleaning non-ASCII characters out of a PySpark DataFrame is straightforward with the regexp_replace() function. You can strip these characters entirely, filter out rows that become empty afterwards, or substitute a placeholder instead of removing them. This approach is especially helpful when dealing with messy text data in large datasets, and it helps ensure your data is clean, consistent, and ready for further analysis or processing.
