pyspark.sql.functions.map_zip_with
pyspark.sql.functions.map_zip_with(col1, col2, f)
Merge two given maps, key-wise, into a single map using a function.

New in version 3.1.0.

Parameters
col1 : Column or str
    name of the first column or expression
col2 : Column or str
    name of the second column or expression
f : function
    a ternary function (k: Column, v1: Column, v2: Column) -> Column. Can use methods of Column, functions defined in pyspark.sql.functions and Scala UserDefinedFunctions. Python UserDefinedFunctions are not supported (SPARK-27052).
 
Returns
    Column
Examples

>>> df = spark.createDataFrame([
...     (1, {"IT": 24.0, "SALES": 12.00}, {"IT": 2.0, "SALES": 1.4})],
...     ("id", "base", "ratio")
... )
>>> df.select(map_zip_with(
...     "base", "ratio", lambda k, v1, v2: round(v1 * v2, 2)).alias("updated_data")
... ).show(truncate=False)
+---------------------------+
|updated_data               |
+---------------------------+
|{SALES -> 16.8, IT -> 48.0}|
+---------------------------+
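The additional sketch below is not part of the original reference. It assumes the documented key-wise merge behaviour: the result map covers the keys of both inputs, and for a key present in only one map the value from the other side is passed as null. Here coalesce (from pyspark.sql.functions) substitutes 0.0 for the missing side before adding; the DataFrame df2 and its columns are hypothetical names used only for illustration.

>>> from pyspark.sql.functions import map_zip_with, coalesce, lit
>>> df2 = spark.createDataFrame(
...     [(1, {"IT": 10.0, "HR": 5.0}, {"IT": 2.0})],
...     ("id", "base", "bonus")
... )
>>> # "HR" exists only in base, so v2 is null for that key;
>>> # coalesce replaces the null with 0.0 before the addition.
>>> df2.select(map_zip_with(
...     "base", "bonus",
...     lambda k, v1, v2: coalesce(v1, lit(0.0)) + coalesce(v2, lit(0.0))
... ).alias("total")).show(truncate=False)

The expected result maps IT to 12.0 and HR to 5.0; the display order of map keys may vary.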