The Applications Development Senior Programmer Analyst is responsible for establishing and implementing new or revised data platform ecosystems in coordination with the Technology team. The role focuses on building and maintaining data pipelines, optimizing data infrastructure, and delivering robust data products for analytics and data science teams.
Job Responsibility:
Build and maintain batch or real-time data pipelines on the data platform
Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources
Develop ETL (extract, transform, load) processes to help extract and manipulate data from multiple sources
Monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users
Automate data workflows such as data ingestion, aggregation, and ETL processing
Transform raw data in data warehouses into consumable datasets for both technical and non-technical stakeholders
Build, maintain, and deploy data products for analytics and data science teams on the data platform
Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures
Monitor data systems performance and implement optimization solutions
Has the ability to operate with a limited level of direct supervision
Can exercise independence of judgement and autonomy
Acts as SME to senior stakeholders and/or other team members
Serve as advisor or coach to new or lower-level analysts
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations
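The ETL responsibilities above (extract from varied sources, transform into consumable datasets, load into a warehouse table) can be sketched in miniature. This is an illustrative example only, not part of the role description: it uses Python's standard library (csv, sqlite3) in place of the PySpark/Hive stack the posting names, and the `orders` table, column names, and sample data are invented for the sketch.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize customer names and cast amounts to integer cents."""
    return [
        {
            "customer": row["customer"].strip().title(),
            "amount_cents": int(round(float(row["amount"]) * 100)),
        }
        for row in rows
    ]

def load(rows, conn):
    """Load: write the cleaned rows into the target warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (customer TEXT, amount_cents INTEGER)"
    )
    conn.executemany(
        "INSERT INTO orders (customer, amount_cents) VALUES (:customer, :amount_cents)",
        rows,
    )
    conn.commit()

# Run the three stages end to end against an in-memory database.
raw = "customer,amount\n alice ,19.99\nBOB,5.50\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount_cents) FROM orders").fetchone()[0]
```

In a production pipeline each stage would typically be a separate, scheduled task (e.g. orchestrated by a workflow manager such as Airflow), but the extract/transform/load separation is the same.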
Requirements:
9 to 14 years of relevant experience in a data engineering role
Advanced SQL/RDBMS skills and experience with relational databases and database design
Strong proficiency in object-oriented languages; Python and PySpark are a must
Experience working with Big Data technologies such as Hive/Impala/S3/HDFS
Experience working with data ingestion tools such as Talend or Ab Initio
Strong proficiency in scripting languages like Bash, UNIX Shell scripting
Strong proficiency in data pipeline and workflow management tools
Strong project management and organizational skills
Excellent problem-solving, communication, and organizational skills
Proven ability to work independently and with a team
Experience in managing and implementing successful projects
Ability to adjust priorities quickly as circumstances dictate
Consistently demonstrates clear and concise written and verbal communication
Nice to have:
Experience working with data lakehouse architectures such as AWS Cloud/Airflow/Starburst/Iceberg