The Data and Analytics Engineer III is responsible for the design, development, and implementation of optimal solutions to integrate, store, process, and analyze large data sets. This includes an understanding of methodology, design, specifications, programming, delivery, monitoring, and support standards.
The individual must have extensive knowledge of designing and developing data pipelines and delivering advanced analytics with open source Big Data processing frameworks, such as Hadoop version 2 technologies, as well as proven competency in programming using distributed computing principles.
The Data and Analytics Engineer III is also responsible for supporting the business goals and objectives for the Data Management & Analytics Department, the Information Management Organization, and the organization as a whole.
- Design & Development -- Provides technical development expertise for designing, coding, testing, debugging, documenting, and supporting all types of applications consistent with the established specifications and business requirements in order to deliver business value.
- Strategy Execution -- Contributes to the execution of CHRISTUS' overall information systems strategy, in both strategic and tactical plans, as it pertains to the organization's vision. Involved in team adoption, execution, and integration of strategy to achieve optimal and efficient delivery.
- System Engineering -- Involved in the evaluation of proposed system acquisitions or solutions development and provides input to the decision-making process relative to compatibility, cost, resource requirements, operations, and maintenance.
- System Integration -- Integrates software components, subsystems, facilities and services into the existing technical systems environment; assesses impact on other systems, and works with cross functional teams within information management to ensure positive project impact. Installs, configures, and verifies the operation of software components.
- System Management -- Participates in development of standards, design and implementation of proactive processes to collect and report data and statistics on assigned systems.
- System Security -- Participates in the research, design, development, and implementation of application, database, and interface security using technologies such as SSL, Public-Key encryption, and Certificates or other emerging security technologies.
- Bachelor's degree in Computer Science, Engineering, Math, or a related field is required. Master's degree is preferred.
- Proficiency with Hadoop v2, MapReduce, Spark, HDFS, Python or R.
- Experience with data integration using ETL techniques and frameworks, such as Flume.
- Proficiency with Big Data querying tools, such as Pig, Hive, and Impala.
- Experience with messaging systems, such as Kafka or MQ.
- Experience with Big Data Machine Learning toolkits, such as Mahout or SparkML.
- Knowledge of NoSQL databases, such as HBase, Cassandra, and MongoDB.
- Knowledge of Hadoop cluster management, including all associated services.
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming is preferred.
- Good understanding of the Lambda Architecture.
- Advanced SQL programming and query performance tuning techniques for data integration and consumption, designing for optimal performance against large data assets within OLTP, OLAP, and MPP architectures.
- Solid understanding of the BI and analytics landscape, preferably in large-scale development environments.
- Must have the communication skills and ability to develop and present solutions to all levels of management (including executive levels).
- Must have demonstrated the ability to solve complex problems with minimal direction.
- Must be able to interact effectively and patiently with customers especially while under pressure.
- Ability to work on multiple projects/tasks simultaneously to meet project deadlines for self and others as required.
- Ability to establish and maintain positive working relationships with other employees.
- Minimum of six (6) years of experience in MapReduce and Spark programming.
- Minimum of six (6) years of experience developing analytics solutions with large data sets within OLAP and MPP architectures.
- Minimum of ten (10) years of experience with the design, architecture, and development of enterprise-scale platforms built on open source frameworks.
C. Licenses, Registrations, or Certifications:
- Certifications in Hadoop or Java are a plus.