As a Senior Data Engineer, you will be empowered to leverage data to drive amazing customer experiences and business results. You will own the end-to-end development of data engineering solutions that support the analytical needs of the business. The ideal candidate is passionate about working with disparate datasets and loves bringing data together to answer business questions at speed. You should have deep expertise in the creation and management of datasets and a proven ability to translate data into meaningful insights through collaboration with analysts, data scientists, and business stakeholders.

You will partner with accounting and finance business teams to expand the Financial Data Repository in support of new business initiatives. The Senior Data Engineer role on the Finance Data Repository is a challenging and exciting one, with new business initiative launches across payments, finance, tax, and the ordering platform. Managing these changes on a daily basis is highly critical for proper accounting, keeping our books clean and accurate.
Job Responsibilities:
Design and build mission-critical data pipelines on a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation
Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders
Build and support reusable frameworks to ingest, integrate, and provision data
Automate end-to-end data pipelines with metadata capture, data quality checks, and auditing
Build and support a big data platform on the cloud
Define and implement automation of jobs and testing
Optimize the data pipeline to support ML workloads and use cases
Support mission-critical applications and near-real-time data needs from the data platform
Capture and publish metadata and new data to subscribed users
Work collaboratively with business analysts, product managers, and data scientists as well as business partners, and actively participate in design thinking sessions
Participate in design and code reviews
Motivate, coach, and serve as a role model and mentor for other development team members who leverage the platform
Requirements:
3 to 5 years' experience in data warehouse / data lakehouse technical architecture
3+ years of experience with programming languages (Python / Scala / Java / C#)
Minimum of 3 years of experience with Big Data and Big Data tools in one or more of the following: batch processing (e.g., Hadoop distributions, Spark), real-time processing (e.g., Kafka, Flink/Spark Streaming)
Minimum of 2 years' experience with AWS or engineering in other cloud environments
Strong knowledge of Databricks SQL/Scala data engineering pipelines
Experience with database architecture and schema design
Strong familiarity with batch processing and workflow tools such as dbt, Airflow, and NiFi
Ability to work independently with business partners and management to understand their needs and exceed expectations in delivering tools/solutions
Strong interpersonal, verbal, and written communication skills, with the ability to present complex technical/analytical concepts to an executive audience
Strong business mindset with customer obsession
Ability to collaborate with business partners to identify needs and opportunities for improved data management and delivery
Experience providing technical leadership and mentoring other engineers on data engineering best practices
Bachelor's degree in Computer Science, or a related technical field
What we offer:
Opportunities for benefits (e.g., medical, dental), equity, and discretionary bonuses