An Apex client in Roanoke, VA is seeking a Technical Lead to lead its migration to a cloud-hosted data lake. This exciting, long-term contract position will help build the next generation of data systems for our Fortune 500 client. If you possess the skills outlined below and would like to discuss the opportunity further, please apply directly or submit your resume to email@example.com
- Provide design expertise; identify the mode and syntax of data movement from various data sources to the AWS data lake
- Coordinate and share technical expertise on big data technologies
- Execute on the technical architecture; produce design documentation for each sprint and update it incrementally
- Identify design related dependencies and conflicts between various application components
- Accountable for solution delivery, implementation quality, and adherence to best practices
- Work closely with the Product Owner on product prioritization; identify risks/issues and provide mitigation plans
- Accountable for seamless delivery of work products per the finalized sprint plan
- Lead bug-fixing efforts across all test cycles and post-production
- Accountable for smooth deployment of the validated work increment in all environments
- Communicate technical dependencies to the relevant teams and coordinate seamless implementation of the individual components (e.g., DevOps)
- Plan for the various environments, attend CAB (Change Advisory Board) meetings, and request approvals
- Lead warranty and support tasks for the Scrum team
- Experience building and optimizing big data pipelines, architectures, and data sets
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Hands-on experience across multiple projects with analytics, business intelligence, and data visualization is a plus
- Proficient in designing efficient and robust ETL workflows; hands-on experience with Teradata, Oracle, DataStage, AWS, Python/JavaScript, and Glue. Experience with the Hadoop big data stack is a plus
- Experience building large-scale data processing systems and expertise in data warehousing solutions
- Strong SQL experience; able to read transformation logic in ETL tools such as DataStage and SAP BODS and document it in a data dictionary
- Hands-on AWS experience reading S3 content to create data dictionaries, conduct data research, and study data patterns