Distinguished Data Engineers are individual contributors who strive for diversity of thought so that we can see the full problem space. At Capital One, we believe diversity of thought strengthens our ability to influence, collaborate, and deliver the most innovative solutions across organizational boundaries. Distinguished Engineers significantly impact our trajectory and devise clear roadmaps to deliver next-generation technology solutions. The Data Operations Center (DOC) serves as the backbone of our enterprise data ecosystem, ensuring the stability and scalability of our core enterprise data platforms. As a horizontal organization, we manage a diverse tech stack, including Snowflake, Databricks, DynamoDB, file transfers, and legacy systems such as Ab Initio, to provide a high-performance foundation for the entire enterprise, while housing an internal AI/ML engineering team focused specifically on our evolution. By integrating AI/ML directly into our operations, we are transforming our model from reactive maintenance to a predictive, self-healing, and automated platform architecture, ensuring the organization continuously evolves its support mechanisms from reactive to predictive and automated operations.
Job Responsibilities:
Lead the design and implementation of highly scalable, fault-tolerant, and cost-effective data architectures that seamlessly integrate Databricks (for complex processing and ML) and Snowflake (for warehousing and ETL)
Drive performance engineering and optimization for large-scale data ingestion and processing workloads across the Databricks, Snowflake, and AWS data pipeline
Provide technical leadership for the design, development, testing and deployment of agentic workflows across Capital One
Serve as a deep technical expert and thought leader who helps accelerate adoption of the very best engineering practices, while maintaining knowledge of industry innovations, trends, and practices
Act as an evangelist, both internally and externally, helping to elevate the Distinguished Engineering community and establishing yourself as a go-to resource on given technologies and technology-enabled capabilities
Build awareness, increase knowledge and drive adoption of modern technologies, sharing consumer and engineering benefits to gain buy-in
Strike the right balance between lending expertise and providing an inclusive environment where others’ ideas can be heard and championed
Leverage expertise to grow skills in the broader Capital One team
Promote a culture of engineering excellence, using opportunities to reuse and innersource solutions where possible
Operate as a trusted advisor for a specific technology, platform, or capability domain, helping to shape use cases and implementation in a unified manner
Lead the way in creating next-generation talent for Tech, mentoring internal talent and actively recruiting external talent to bolster Capital One’s Tech talent
Requirements:
Bachelor’s Degree
At least 7 years of experience in data engineering
At least 3 years of experience in data architecture
At least 2 years of experience building applications in AWS
At least 5 years of experience with Python, SQL, or Scala
At least 3 years of experience developing AI and ML algorithms or technologies
Nice to have:
Master’s Degree
5+ years of experience in data engineering (e.g., Hadoop, AWS, Snowflake, ETL)
5+ years of experience in Data Governance, Data Governance Platforms, Data Standardization, and Data Modeling
5+ years of experience deploying scalable and responsible AI solutions on cloud platforms (e.g., AWS, Google Cloud, Azure, or an equivalent private cloud)
Experience developing AI and ML algorithms or technologies (e.g., LLM inference, similarity search and vector databases, guardrails, memory) using Python, C++, C#, Java, or Golang
Deep proficiency and strategic experience with Databricks (e.g., Delta Lake, Unity Catalog, MLflow, performance tuning Spark workloads)
Deep proficiency and strategic experience with Snowflake (e.g., Data Sharing, Snowpipe, external tables, security features, cost governance, advanced SQL, Stored Procedures)
What we offer:
Comprehensive, competitive, and inclusive set of health, financial, and other benefits that support your total well-being
Performance-based incentive compensation, which may include cash bonus(es) and/or long-term incentives (LTI)