SoftClouds LLC is looking for an Azure ETL Developer with at least 8 years of experience as an ETL Developer on large-scale enterprise projects. This hands-on role involves designing and implementing functionality and developing integrations across multiple applications. The ideal candidate will have a strong grasp of the Azure cloud and extensive experience with the specified technology stack.
Job Responsibility:
Design, build, and maintain scalable data pipelines and modern data Lakehouse architectures using Microsoft Fabric, Synapse Analytics, and Azure Data Factory
Implement end-to-end data pipelines across bronze, silver, and gold layers within Microsoft Fabric
Develop Dataflows Gen2, Spark notebooks, and Synapse pipelines to ingest data from varied sources (databases, APIs, Excel/CSV files)
Manage and optimize data storage within OneLake, including partitioning, schema evolution, and Delta tables
Integrate curated data from pipelines with Power BI semantic models for advanced analytics and reporting
Implement data governance, lineage tracking, and security using Microsoft Purview, Defender for Cloud, and Azure Key Vault
Monitor, schedule, and optimize pipeline performance using tools such as Monitoring Hub and Azure Monitor
Automate deployments and workflows using CI/CD pipelines (for Fabric and Azure Data Factory) and Power Automate where appropriate
Create interactive Power BI dashboards and reports, leveraging Direct Lake mode and connecting to Lakehouse/Warehouse datasets
Apply best practices for data quality, cleansing, and transformation (using KQL, PySpark, or SQL)
Enforce data security policies and compliance, including role-based access and data masking
Manage and oversee the data warehouse, ensuring its optimal performance and reliability
Enhance existing database tables or create new ones in response to evolving reporting requirements
Ensure that all data is comprehensive, precise, and consistently synchronized with source systems
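To illustrate the bronze/silver/gold ("medallion") flow the responsibilities above describe, here is a minimal plain-Python sketch. In a real Microsoft Fabric Lakehouse this logic would run as PySpark, SQL, or KQL over Delta tables; the record fields and values below are illustrative assumptions, not part of the role.

```python
# Minimal sketch of a bronze -> silver -> gold medallion flow.
# Fields and values are hypothetical; in Fabric this would be PySpark over Delta tables.

# Bronze: raw records exactly as ingested (duplicates and bad rows included)
bronze = [
    {"order_id": "1", "customer": "acme", "amount": "120.50"},
    {"order_id": "1", "customer": "acme", "amount": "120.50"},  # duplicate
    {"order_id": "2", "customer": "globex", "amount": "75.00"},
    {"order_id": None, "customer": "unknown", "amount": "0"},   # invalid row
]

# Silver: deduplicate, drop invalid rows, and conform types (data-quality step)
seen = set()
silver = []
for row in bronze:
    if row["order_id"] is None or row["order_id"] in seen:
        continue
    seen.add(row["order_id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: business-level aggregate, ready to feed a Power BI semantic model
gold = {}
for row in silver:
    gold[row["customer"]] = gold.get(row["customer"], 0.0) + row["amount"]

print(gold)  # {'acme': 120.5, 'globex': 75.0}
```

Each layer refines the previous one: bronze preserves the source verbatim, silver enforces quality rules, and gold exposes a reporting-ready shape.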
Requirements:
Proficiency in all of the Microsoft Azure services below
MS Fabric
Azure
Synapse
Data pipelines
Lakehouse
Candidate must have solid research/troubleshooting and analytical skills
The ability to dig into code or documentation to solve issues, leveraging all resources available
Must be able to apply SDLC concepts and Agile Scrum methodologies
Has a proven track record of delivering solid, robust applications
Thorough knowledge of design and integration principles for complex IT environments
Understanding of the unique business and technical requirements of each engagement to facilitate both communication and the most appropriate solution design
Detailed design, development and unit / integration testing utilizing the appropriate methodologies, technology, and tools
Ability to produce client deliverables such as detailed design documentation, unit test plans and well-documented code and ensure deliverables are of the highest quality to promote client satisfaction
The candidate should also possess strong oral and written communication and problem-solving skills, and be a team player
Enthusiasm, attention to detail, and ability to work on a variety of projects are necessary
This position also requires excellent time management and communication skills
Requires a minimum of a bachelor’s degree in engineering, preferably Computer Science/Engineering
Nice to have:
Knowledge of the healthcare domain
Hands-on experience with Power BI
Knowledge of system design and integration
Strong analytical and problem-solving skills
Familiarity with security standards and best practices in application development