Country: United States of America
Location: CAG23: DLS VRF - Atlanta, 3300 Riverwood Pkwy, Atlanta, GA, 30339 USA

Director, Cloud Data Platform and Data Lake

Are you inspired by big data and the endless possibilities it creates? At Carrier we are starting the journey to harness big data to improve everyday experiences for our customers. As our customers go about their daily lives, they generate endless data that touches all business units at Carrier and represents a tremendous opportunity. By accessing that data, we want to develop real-time decision systems that provide the underlying resources to innovate on our data. At Carrier we want to redefine how data is consumed and bring in cutting-edge data streaming technologies while building custom experiences that humanize the creation and consumption of data. To succeed, you have to be passionate about building enterprise products and delivering amazing experiences for internal stakeholders. For this team, we are looking for a data platform technical product manager and architect who demonstrates deep knowledge of data streaming platforms while also defining a vision for the data platform and coordinating efforts across the enterprise.

In this role you will be:
- Building and managing a product roadmap for a new data lake platform that serves the needs of streaming producers across Carrier
- Developing a master data governance framework, including data governance strategy, approach, and roadmap
- Understanding user needs as well as the technical requirements and limitations when building platform services
- Managing cross-dependencies across teams to ensure alignment, and adjusting team directions when gaps exist
- Establishing a data dictionary and authoritative sources for the core data elements required to support Finance processes and reporting
- Recommending technological solutions to improve data quality and data integration across the financial platform
- Partnering with business unit technology teams and other enterprise data functions to drive the long-term development of data infrastructure, including data warehousing, reporting, and analytics platforms
- Reviewing architectural designs and IT solutions to ensure consistency, maintainability, and flexibility
- Partnering with business leadership to identify problems and opportunities for technology innovation
- Helping establish a clear, consistent technology vision through collaboration, influence, and enablement
- Anticipating and managing technology evolution and relating it to business solutions
- Setting direction based on alignment to business and overall technology standards
- Staying informed on initiatives across the industry and the enterprise to help product owners and IT leadership effectively prioritize
- Crafting guidance that helps delivery teams identify quality risk factors around security and compliance
- Identifying and assessing the organizational impact of enterprise Big Data architecture and standards, including changes in skills, processes, and structures
- Accelerating delivery by forming a network of positive relationships with vendors and other influencers within the enterprise and beyond
- Promoting the company's technology brand through creative thinking and constantly raising the bar of what's possible

What you should bring to the table:
- A deep understanding of the technical fundamentals of platform technologies
- Ability to guide the design and architecture of the new data lake, which will be built from scratch
- Ability to work across the enterprise to onboard new internal customers and integrate streaming app data into a single data lake platform
- A unique combination of deep technical understanding of enterprise software development and product management expertise
- Ability to understand all aspects of Carrier's data ecosystem, spanning from data governance controls to data quality, and to become the expert on data streaming technologies
- Ability to work independently across geographically diverse teams
- Ability to intuitively identify impediments on the critical path and unblock them
- Structured problem solving to help a team work through a wide range of issues
- Strong analytical mindset and the ability to use sound data and logic to make compelling business cases
- Ability to design, implement, and support a production Enterprise Data Lake, including procedures and workflows
- Ability to develop methodologies to address various analytic challenges, covering both internal and client-driven needs
- Strong understanding of service-oriented architecture and how (near) real-time integrations work
- Ambition and a figure-it-out attitude: the ability to independently investigate internal and third-party data sources, technical specs, programming, and operational challenges with limited assistance
- Analytical problem solving to clearly identify, analyze, and resolve issues as they arise
- Passion for ensuring that data access and handling are in stringent accordance with financial compliance regulations
- Large-scale relational database, data lake, and data warehouse experience with a strong grasp of SQL to access, analyze, and assemble data to meet campaign targeting and personalization requirements
- In-depth understanding of the different technologies that can be used to build a data lake and analyze its data, such as NoSQL databases, Hadoop, Spark, Scala, distributed computing, and AWS services, e.g., Elastic MapReduce (EMR), Snowflake, Lambda, EC2, etc.

Experience Requirements:
- Master's or PhD in data science, computer science, or a related field
- 5+ years of prior experience with AWS, SQL, and NoSQL, with a focus on data modeling (adequately skilled in data normalization), data warehousing (prior experience building data marts and data lakes), data science (adequate expertise building inference engines on big data platforms for predictive and machine learning systems), and data analysis (prior experience with compliance reporting systems, audit, and data governance)
- 5+ years of experience with Hadoop/HDFS
- Prior experience building high-performance NoSQL datastores for high-traffic, high-velocity data attributes; practical experience with storage formats; expertise with support for schema stamping and schema evolution; past experience with data governance and access control issues related to NoSQL stores
- 5 years in Big Data architecture
- 5+ years in data strategies
- 1+ years in machine learning
- Ability to build a business case for data platform growth and specific scenarios
- 10+ years in data integration
- 5+ years in the AWS Big Data stack
- 3+ years in Spark, Scala, or Python, and in data streaming
- 1+ years integrating APIs and RESTful services
- 3+ years with data ingestion
- 5+ years of experience in platform product management, developer services, or enterprise data management
- 2+ years of experience using real-time data systems such as Kafka, Kinesis, or MQ
- 1+ years of experience working within the Scaled Agile Framework
- 1+ years of program management experience
- At least 5 years of experience with SQL
- At least 1 year of experience working with AWS and S3
United Technologies Corporation is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.