Technical Architect - Databricks
Technology | Full-time | Partially remote
Apply by: No close date
About Us:
We turn customer challenges into growth opportunities.
Material is a global strategy partner to the world’s most recognizable brands and innovative companies. Our people around the globe thrive by helping organizations design and deliver rewarding customer experiences.
We use deep human insights, design innovation and data to create experiences powered by modern technology. Our approaches speed engagement and growth for the companies we work with and transform relationships between businesses and the people they serve.
Srijan, a Material company, is a renowned global digital engineering firm with a reputation for solving complex technology problems using its deep technology expertise and strategic partnerships with top-tier technology partners. Be a part of an awesome tribe.
Job Description – Databricks Architect
Key Responsibilities:
- Lead the architecture, design, and implementation of end-to-end data solutions on the Databricks Lakehouse Platform.
- Build and optimize scalable data pipelines for structured and unstructured data using Apache Spark on Databricks.
- Define and enforce data engineering and architectural best practices.
- Collaborate with data scientists, analysts, and business stakeholders to understand requirements and translate them into scalable solutions.
- Architect and lead the design of CI/CD pipelines and automation for Databricks pipelines.
- Ensure data governance, security, and compliance in all solutions.
- Provide technical leadership and mentorship to data engineering teams.
- Evaluate new tools and technologies to continuously improve platform efficiency and performance.
- Work with Azure/AWS/GCP cloud environments integrated with Databricks.
Required Skills & Qualifications:
- 8+ years of hands-on experience with Databricks, including Spark, Delta Lake, MLflow, and Databricks SQL.
- Strong experience in building large-scale ETL/ELT data pipelines.
- Solid knowledge of data modeling, data warehousing, and distributed computing.
- Experience with cloud platforms (Azure, AWS, or GCP), preferably Azure.
- Proven track record of implementing best practices in data engineering and architecture.
- Proficiency in Python, SQL, and Spark-based processing.
- Strong understanding of DevOps practices in the context of Databricks (CI/CD, infrastructure as code).
- Excellent communication and stakeholder management skills.
Good to Have:
- Experience with, or working knowledge of, Microsoft Fabric.
- Familiarity with Power BI, Synapse Analytics, or other modern BI tools.
- Certifications related to Databricks or Azure (e.g., Databricks Certified Data Engineer, Microsoft Certified: Azure Solutions Architect).