Software Developer - Scala/Big Data
Number of Openings:
Duties and Responsibilities:
- Possesses an understanding of how technology solutions meet business outcomes and offers a range of solutions for business partners; understands the business's current and aspirational needs
- Participates in sprint planning; provides work estimates to deliver product stories; owns development stories
- Develops solutions on a variety of platforms according to business requirements
- Completes required coding to satisfy the defined acceptance criteria and deliver the desired outcome
- Leads solution design, considering risks, mitigations, performance, user experience, and testability
- Assists in development of automated testing and supporting code as necessary
- Completes required documentation to communicate information to deployment, maintenance, and business teams
- Utilizes agile software development practices, data and testing standards, code reviews, source code management, continuous delivery, and software architecture
- Participates in the full software development cycle, including coding, testing, implementation, support, and sunset
- Designs, develops, tests and supports software in support of big data objectives
- Possesses an understanding of User Experience practices to improve usability and interaction between the customer and product
- Adopts Service Design, where appropriate, through architecture modularity to enable continuous delivery
- Considers applying emerging technology solutions to increase efficiency and effectiveness; expectation of continuous innovation
- Resolves problems in ways that decrease time to market; improves quality, enhances flexibility, and embraces a solution-provider mindset
- Provides input into overall testing plan; contributes to test approach and scenarios for requirements
- Provides product and/or process expertise necessary to support design, development, testing and execution of solutions
- Exhibits a DevOps mindset in which the team is accountable for the product from inception to sunset
- Uses knowledge of distributed computing techniques to design, develop, and test scalable applications that operate on large-volume datasets
- Handles datasets containing a mix of structured and unstructured data
- Transforms unstructured data into forms suitable for analysis and modeling
- Performs extract, transform, and load (ETL) integrations with a variety of data sources
- Writes ad-hoc scripts and queries, schedules batch jobs, and develops and monitors real-time streaming applications
- Has a background in technologies such as Hadoop, Spark, Pig, and Hive
- Provides system software support for State Farm applications, components, and testing software (testware)
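As an illustration of the day-to-day work described above, the sketch below shows the kind of transformation this role involves: parsing unstructured log lines into structured records and aggregating them. It uses plain Scala collections as a stand-in for a Spark Dataset so it runs self-contained; the record shape, field names, and sample data are hypothetical, not drawn from any State Farm system.

```scala
// Hypothetical sketch: ETL-style transform of unstructured text
// into structured records, using plain Scala collections in place
// of a Spark Dataset. All names and data here are illustrative.
case class LogRecord(level: String, message: String)

object EtlSketch {
  // "Extract" stage: raw, semi-structured input lines
  val rawLines: Seq[String] = Seq(
    "ERROR: disk full",
    "INFO: job started",
    "not a log line",
    "ERROR: timeout"
  )

  // "Transform" stage: keep only parseable lines, split into fields
  def parse(line: String): Option[LogRecord] =
    line.split(": ", 2) match {
      case Array(level, msg) => Some(LogRecord(level, msg))
      case _                 => None
    }

  // "Load" stand-in: aggregate record counts by level
  def countByLevel(lines: Seq[String]): Map[String, Int] =
    lines.flatMap(parse).groupBy(_.level).view.mapValues(_.size).toMap

  def main(args: Array[String]): Unit =
    println(countByLevel(rawLines)) // e.g. Map(ERROR -> 2, INFO -> 1)
}
```

The same flatMap/groupBy shape carries over almost directly to Spark's Dataset API, where the collection operations run distributed across partitions instead of in local memory.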
3+ years of experience in the following:
- Linux Shell Scripting
- Demonstrated analysis and problem-solving skills
- Ability to quickly adapt to a changing work environment
- Ability to learn quickly and coach/mentor others
- Ability to foster innovation, encourage diversity of thought, and incorporate new ideas
- Strong communication skills, verbal and written
Experience with the following is preferred, but not required:
- Agile Methodologies/Work Environment
- Change Management
- DevOps and Automated Deployment Tools (Jenkins, Maven, GitLab, UrbanCode, Docker, Artifactory)
- Build/Release Management
- CI/CD Pipelines
- Job may require travel via commercial transportation and/or driving motor vehicles
- Job may require irregular work hours and travel outside normal business hours