Job Description:

Primary Skills: Databricks, PySpark

- Proficiency in PySpark and Databricks (Delta Lake, clusters, jobs).
- Experience architecting designs for integrating Databricks with applications such as Salesforce and MDM, and with data governance tools such as Collibra.
- Hands-on with Apache Airflow (DAG design, monitoring).
- Strong in AWS services: S3, EC2, Lambda, IAM.
- Strong SQL and Python for transformations and orchestration.
- Knowledge of Lakehouse architecture (Delta Lake) and data modeling.
- Exper
Posted: Oct 17, 2025
Source: dice.com