Location: Riverwoods, Illinois
Employment Type: Contract
Job ID: 690
Date Added: 02/12/2024
Description
- Develops and troubleshoots data integration solutions with complex data transformations and provides guidance to other team members.
- Influences other team members to achieve commitments per guidance from Chapter Leads and actively contributes to agile ceremonies.
- Demonstrates strong technical aptitude across data engineering practices:
  - Utilizing a variety of tools to profile data, secure it in transit and at rest, and enforce data governance controls and alerting
  - Designing advanced SQL queries
  - Leveraging metadata-driven frameworks for solutions
  - Developing test scripts for unit and integration testing
- Develops test methodologies for specific products.
- Leads code review sessions and other process and operational improvement initiatives.
- Exhibits fluency with supplemental tools and technologies involved in data integration (Unix/Linux, TWS/Control-M or similar, BI stack)
- Works on holistic solutions, driving feature and story delivery (Agile)
- Identifies and effectively communicates upstream and downstream impacts for changes in the data pipeline.
- Participates in the on-call rotation for support.
- Demonstrates effective and clear communication in team and cross-functional meetings, and leads tech communities.
- Builds strong collaborative working relationships both within the team and cross-functionally.
Requirements
- 6+ years of proven experience with ETL tools (Ab Initio, DataStage, Informatica)
- Experience in programming languages (Unix scripting, Python, Java, etc.)
- Experience with other technology tools and stacks, such as CI/CD pipelines (Jenkins), GitHub, Python, Spark, and open-source technologies
- Experience with supplemental tools and technologies involved in data integration (Unix/Linux, TWS/Control-M or similar, BI stack)
- Experience working in cloud platforms (AWS, GCP, Azure)
- Experience optimizing queries in both relational (SQL) and NoSQL databases.
- Proven capability in creating artifacts that comply with Enterprise Data Governance standards (source-to-target mappings, metadata definition files, etc.)
- Experience documenting current-state processes and designing reusable, operationally resilient data pipelines.
- Understanding of key infrastructure concepts (distributed data platforms) needed to build data-intensive applications.
- Financial and MarTech background is a huge plus.