• Monitor and maintain all data platform and analytical processes, including the Hadoop Data Lake ecosystem.
• This is a support/administration role with the potential for some development responsibilities on minor enhancements and break/fixes.
• Ensure all system issues are resolved in a timely manner.
• Leverage the standard ITIL Issue, Problem, and Change Management toolset to track and report status on support work activities.
• Coordinate the necessary operations documentation and standards required for compliance and identified best practices.
• Perform all software upgrades and patching, partnering with the necessary Infrastructure teams.
• Provide work estimates as needed. Facilitate and lead meetings with end users.
• Keep management, team members, and business stakeholders informed of critical issues, status, and changes.
MAJOR RESPONSIBILITIES % OF TIME
• Monitor and ensure stability of the analytical and data platforms, environments, and processes to meet defined SLAs. 30%
• Partner with various business stakeholders to gather requirements for minor enhancements. 25%
• Work effectively with other Technology Operations/Infrastructure teams and the responsible technology/third-party support teams to ensure continued operations and maintenance of analytical and data management platforms, environments, and processes. 20%
• Support Release Management and Change Control processes along with other UPT Compliance Initiatives. 10%
• Partner with various business stakeholders to debug, test, develop, and implement fixes for the customer/campaign management data platforms, environments, and processes. 20%
EDUCATION: Bachelor’s degree in computer science is required.
SKILLS AND EXPERIENCE:
• 7+ years of demonstrated experience working as part of large Information Technology teams and/or consulting organizations, partnering with clients/business groups to support a complex Big Data platform or BI analytics environment.
• 2+ years of experience in installation and administration of the Hadoop ecosystem.
• 2+ years of experience in data integration with the Hadoop ecosystem.
o Certifications in Hortonworks HDP and HDF administration or development preferred.
o Experience in operational support of HDFS and edge applications, including Hive (Tez, LLAP), Spark, Ranger, HBase, Knox, and Kerberos.
o Experience with NiFi as an ETL tool is preferred.
o Strong skill set in troubleshooting and resolving HDFS cluster issues using Ambari.
• 5+ years of experience in development or operational support of Enterprise BI analytics systems.
o Core understanding of EDW architecture.
o Strong SQL skills on RDBMS platforms (MS SQL Server, DB2, etc.).
• 2+ years of experience in operational support of Linux server operating systems.
o Red Hat 6 or higher certified administrator preferred.
o Experience configuring JDBC and ODBC connections.
o Ability to read and interpret Java code.
• Ability to produce high-quality technical documentation.
• Strong knowledge of Agile project management methodologies/processes.
• Able to understand and interpret Entity-Relationship, logical and physical data models.
• Able to work with various platforms and databases (e.g., DB2, SQL Server). Able to write complex SQL and leverage backend databases.
• Able to participate in an on-call rotation and work off-hours and weekends.
• Strong interpersonal skills, with a demonstrated ability to make effective decisions while working through complex system issues.
• Must be able to utilize and effectively communicate functional and technical components of an initiative to applicable parties both verbally and through documentation.
• Attention to detail, strong analytical and problem-solving skills, and critical thinking.
• Self-starter with a proactive, agile, and strategic mindset.