Original Post
**DESCRIPTION**

We are seeking a visionary Data Architect to own and evolve the enterprise-wide data architecture on Databricks for a major electric transmission and distribution operator in North America. This is a strategic, hands-on leadership role responsible for designing the lakehouse blueprint that unifies grid telemetry, SCADA/OMS feeds, AMI meter data, GIS, disaster recovery operations, and field workforce systems. You will serve as the technical authority across data engineering, analytics, and AI/ML platform decisions, ensuring compliance with federal and territorial regulatory frameworks.

**KEY RESPONSIBILITIES**

* Define and own the enterprise data architecture strategy on Databricks, establishing lakehouse design patterns, data modeling standards, and platform governance frameworks
* Lead multi-domain data model design covering grid operations, asset management, outage analytics, disaster recovery, and workforce management
* Architect the Unity Catalog implementation for enterprise-wide governance, including lineage, access policies, data classification, and audit readiness for regulatory bodies (PREPA, FEMA)
* Establish data mesh or medallion architecture patterns that enable self-serve analytics while maintaining central governance
* Define the integration architecture for real-time ingestion from SCADA, OMS, AMI, and GIS systems using Kafka, Auto Loader, or Event Hubs
* Provide technical leadership on Databricks capabilities (Delta Live Tables, MLflow, Databricks SQL, Photon), ensuring best practices across all data teams
* Partner with the CDO/CTO/CIO to build a multi-year data platform roadmap and evaluate emerging technologies (AI/BI, Genie, Delta Sharing)
* Mentor and provide technical direction to data engineers and analytics engineers

**REQUIRED SKILLS**

**Platform & Architecture**

* Databricks Platform Architecture
* Delta Lake / Lakehouse Design
* Unity Catalog & Data Governance
* Data Mesh / Medallion Architecture
* Enterprise Data Modeling
* MLflow & AI/ML Platform Design

**Cloud & Integration**

* Azure / AWS Cloud Architecture
* Apache Kafka / Event Streaming
* Databricks SQL & Photon Engine
* dbt / SQL / PySpark
* CI/CD & DataOps (Git, Terraform)
* SCADA / OMS / GIS data experience (preferred)

**EXPERIENCE**

* 8 years in data architecture, with at least 3–4 years hands-on with Databricks at enterprise scale
* Bachelor's or Master's degree in Computer Science, Data Engineering, Information Architecture, or a related field
* Databricks Certified Data Engineer Professional and/or Solutions Architect certification preferred
* Proven experience designing lakehouse or data warehouse platforms for complex, multi-source environments in regulated industries (utilities, energy, government, or infrastructure)
* Experience leading or mentoring cross-functional data engineering teams

Job Type: Part-time
Pay: ₹25.00 – ₹30.00 per hour
Expected hours: 2–4 per week
Work Location: Remote
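For candidates unfamiliar with the medallion architecture named above: it layers data as bronze (raw, as ingested), silver (cleaned and deduplicated), and gold (aggregated for analytics). A minimal conceptual sketch in plain Python, using hypothetical AMI meter-reading fields (`meter_id`, `ts`, `kwh`); a real Databricks implementation would use Delta tables and PySpark rather than in-memory lists:

```python
from collections import defaultdict

# Bronze layer: raw AMI readings as ingested; may contain duplicates and nulls.
bronze = [
    {"meter_id": "M1", "ts": "2024-01-01T00:00", "kwh": 1.2},
    {"meter_id": "M1", "ts": "2024-01-01T00:00", "kwh": 1.2},   # duplicate event
    {"meter_id": "M2", "ts": "2024-01-01T00:00", "kwh": None},  # bad reading
    {"meter_id": "M2", "ts": "2024-01-01T01:00", "kwh": 0.8},
]

# Silver layer: deduplicate on (meter_id, ts) and drop readings with no value.
seen = set()
silver = []
for row in bronze:
    key = (row["meter_id"], row["ts"])
    if row["kwh"] is not None and key not in seen:
        seen.add(key)
        silver.append(row)

# Gold layer: aggregate cleaned readings into per-meter consumption totals.
gold = defaultdict(float)
for row in silver:
    gold[row["meter_id"]] += row["kwh"]

print(dict(gold))  # → {'M1': 1.2, 'M2': 0.8}
```

The central governance point in the posting is that each layer is a governed table with lineage and access policies (Unity Catalog), not that the transformations themselves are complex.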