Lead Big Data Developer

Date: May 9, 2019

Location: Broomfield, CO 80021; Seattle, WA 98134; Gardner, KS 66030; Minneapolis, MN 55413-2620; Phoenix, AZ 85004; Monroe, LA 71203

CenturyLink (NYSE: CTL) is a global communications and IT services company focused on connecting its customers to the power of the digital world. CenturyLink offers network and data systems management, big data analytics, managed security services, hosting, cloud, and IT consulting services. The company provides broadband, voice, video, advanced data and managed network services over a robust 265,000-route-mile U.S. fiber network and a 360,000-route-mile international transport network. Visit CenturyLink for more information.

 

Job Summary

As a Big Data Developer, you will be responsible for Cloudera Hadoop development and high-speed querying; for managing and deploying Flume, Kafka, Hive, and Spark; and for overseeing handover to operational teams and proposing best practices and standards. The role requires expertise in designing, building, installing, configuring, and developing within the Hadoop ecosystem. Familiarity with Pentaho and NiFi is a bonus.

 

Job Description

Work with development teams within the data and analytics team to design, develop, and execute solutions that derive business insights and solve clients' operational and strategic problems. Support the development of data and analytics solutions and products that improve existing processes and decision making. Build internal capabilities to better serve clients and demonstrate thought leadership in the latest innovations in big data and advanced analytics. Contribute to business and market development.

 

Specific skills and abilities:

  • Defining job flows
  • Managing and reviewing Hadoop log files
  • Managing Hadoop jobs using a scheduler
  • Providing cluster coordination services through ZooKeeper
  • Supporting MapReduce programs running on the Hadoop cluster
  • Ability to write MapReduce jobs
  • Experience writing Spark scripts
  • Hands-on experience with HiveQL
  • Familiarity with data loading tools such as Flume and Sqoop
  • Knowledge of workflow schedulers such as Oozie
  • Knowledge of ETL tools such as Pentaho

 

Qualifications

  • Bachelor’s degree in a related technical field preferred
  • Expertise with HBase, NoSQL databases, HDFS, and Java MapReduce for Solr indexing; data transformation; back-end programming in Java, JavaScript, and Node.js; and OOAD
  • 7+ years of experience in IT, with a minimum of 2 years of experience in Hadoop

 

Alternate Location: US-Colorado-Broomfield; US-Colorado-Denver; US-Colorado-Littleton; US-Kansas-Gardner; US-Louisiana-Monroe; US-Ohio-Dublin; US-Washington-Seattle

Requisition #: 211864

This job may require successful completion of an online assessment. A brief description of the assessments can be viewed on our website at http://find.centurylink.jobs/testguides/ 

EEO Statement

We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, marital status, family status, pregnancy, or other legally protected status (collectively, “protected statuses”).  We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training.

Disclaimer

The above job definition information has been designed to indicate the general nature and level of work performed by employees within this classification. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job.  Job duties and responsibilities are subject to change based on changing business needs and conditions.




