ETL Cloud Developer
Position Type: Hybrid - 3 days onsite
Contract Length: 2 months + extension
Seeking a senior ETL developer to design, build, and optimize cloud-based data management and warehousing solutions by extracting, transforming, and loading business data. The role involves collaborating with cross-functional teams, using tools such as Azure Data Factory, Databricks, Python, and SQL, and applying advanced data architecture techniques.
Required Skills:
10+ years designing and developing systems for data asset management, ETL processes, and business intelligence.
10+ years designing and supporting data warehouse schemas and developing data marts for new and existing data sources.
10+ years collaborating with data analysts, scientists, and business users to gather requirements and populate data hubs and warehouses.
10+ years demonstrating an advanced understanding of data integration, strong database architecture knowledge, and experience ingesting spatial data.
10+ years of experience resolving conflicts, prioritizing tasks, and managing multiple projects.
10+ years of proficiency with Microsoft tools: Word, PowerPoint, Excel, Project, Visio, and Team Foundation Server.
10+ years of experience with data warehousing architectures including Kimball and Inmon, and designing solutions across various data stores.
10+ years of hands-on experience with Azure technologies: Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse.
10+ years of experience with IBM Datastage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, and Azure SQL Data Warehouse.
10+ years of experience in Windows and Unix environments, including scripting in Python and/or Linux shell.
10+ years of experience in Azure cloud engineering.
Preferred: 5+ years of experience with Snowflake.
Responsibilities:
Design and develop integrations for enterprise data assets, ETL processes, and business intelligence solutions.
Build and maintain data engineering processes that leverage a cloud-based architecture, including migrating legacy pipelines as needed.
Design and support data warehouse schemas and develop data marts for new and existing data sources.
Collaborate with data analysts, scientists, and other stakeholders to gather requirements and populate optimized data warehouse structures.
Partner with data modelers and architects to refine and implement business data requirements for building and maintaining enterprise data assets.