o Minimum of 7 years of experience working with the Apache Hadoop ecosystem of tools and technologies to extract, integrate, cleanse, and organize data, including experience with either the Hortonworks or Cloudera distribution.
Key Tools and Technologies
- Experience working with the following types of workloads and data pipelines:
o Enterprise-scale ETL and ELT batched workloads
o Near real-time micro-batches
o Streaming data
- Experience working with Data Governance frameworks
- Some experience with conceptual and logical data model design
- Experience in the Financial Services, Retail, or Healthcare (Payer or Provider) industries is a plus
- Strong NoSQL, Spark SQL, and ANSI SQL query language skills
- Strong verbal and written English communication skills