Bigdata Platform Engineer - Heredia, Costa Rica - Experian

Experian
Verified company
Heredia, Costa Rica

1 week ago

Posted by:
Andrea Rodríguez
beBee Recruiter


Description
Full-time


Employee Status:
Regular


Role Type:
Home


Department:
Information Technology & Systems


Schedule:
Full Time


Shift:
Day Shift


Company Description:

Experian is the world's leading global information services company.

During life's big moments - from buying a home or a car, to sending a child to college, to growing a business by connecting with new customers - we empower consumers and our clients to manage their data with confidence.

We help individuals to take financial control and access financial services, businesses to make smarter decisions and thrive, lenders to lend more responsibly, and organizations to prevent identity fraud and crime.


We have 20,000 people operating across 44 countries and every day we're investing in new technologies, talented people, and innovation to help all our clients maximize every opportunity.

If you have the skills and a "can do" attitude, we would love to talk to you.


What you'll be doing

  • Responsible for implementation and ongoing administration of Hadoop infrastructure, on both Cloudera and AWS EMR.
  • Develop and maintain Terraform scripts and Infrastructure as Code (IaC) for provisioning and managing AWS resources.
  • Apply expert software development lifecycle practices (branch/release strategies, peer review, and merge practices) and deliver innovative CI/CD solutions using emerging technologies.
  • Automate the deployment, build, and configuration of infrastructure and big data technologies using DevOps tools.
  • Work with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, HBase, and YARN access for the new users.
  • Implement security best practices for the big data platform (HBase, HDFS, Kafka, Hive, etc.).
  • Cluster maintenance, as well as creation and removal of nodes, using tools such as Cloudera Manager Enterprise.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Optimize EMR clusters for performance and cost efficiency.
  • Screen Hadoop cluster job performance and perform capacity planning.
  • Monitor Hadoop cluster connectivity and security.
  • Manage and review Hadoop log files; perform file system management and monitoring.
  • HDFS support and maintenance.
  • General operational expertise, including good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
  • Most essential: the ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure the cluster, and take backups.
  • Solid understanding of on-premises and cloud network architectures.
  • Additional Hadoop skills, such as Sentry, Spark, Kafka, and Oozie.
  • Advanced experience with AD/LDAP security integration with Cloudera, including Sentry and ACL configurations.
  • Ability to configure and support API and open-source integrations.
  • Experience working in a DevOps environment, developing solutions using Ansible and similar tools.
  • Will collaborate and communicate with all levels of technical and senior business management.
  • Will require on-call 24x7 support of production systems on a rotation basis with other team members.
  • Proactively evaluate evolving technologies and recommend solutions to business problems.
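To give a flavor of the capacity-planning and tuning work listed above, here is a minimal sketch of a common community rule of thumb for sizing YARN containers on a worker node. It is an illustrative heuristic only, not Experian's actual procedure; the reservation formula and defaults are assumptions.

```python
# Illustrative YARN container-sizing heuristic (a common community rule
# of thumb, not any specific vendor's procedure): the number of
# containers per node is bounded by cores, disks, and available RAM.

def yarn_container_plan(cores, disks, ram_gb, min_container_gb=2):
    """Return (num_containers, ram_per_container_gb) for one worker node."""
    # Reserve some memory for the OS and Hadoop daemons (rough rule of thumb).
    reserved_gb = max(2, ram_gb // 8)
    available_gb = ram_gb - reserved_gb
    # Containers are limited by CPU, spindle count, and memory.
    containers = max(1, int(min(2 * cores, 1.8 * disks,
                                available_gb / min_container_gb)))
    ram_per_container = max(min_container_gb, available_gb // containers)
    return containers, ram_per_container
```

For example, a node with 16 cores, 8 disks, and 64 GB of RAM would come out to 14 containers of 4 GB each under this heuristic; real tuning would then validate those numbers against actual job profiles.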

Qualifications:


  • Bachelor's degree in Computer Science or a related field, or equivalent experience, is desired.
  • 5+ years of Hadoop infrastructure administration
  • 3+ years of Linux (Red Hat) system administration
  • Cloud platforms (IaaS/PaaS): AWS, Azure, VMware
  • Automation skills: Ansible and/or Terraform
  • Kerberos administration skills
  • Experience with the Cloudera distribution and AWS EMR
  • Experience in coding languages, especially Python
  • Good experience with CI/CD tools such as Jenkins and Puppet, and with shell scripting
  • Knowledge of DevOps tools
  • Working knowledge of YARN, HBase, Hive, Spark, Kafka, Solr, etc.
  • Advanced English proficiency.
  • Strong problem-solving and creative-thinking skills
  • Effective oral and written communication
  • Experience working with geographically distributed teams
  • Ability to adapt to a multilingual and multicultural environment
  • Ability to handle conflicting priorities
  • Ability to learn
  • Adaptability
  • Receptive to change
  • Ability to communicate with business users at all levels
  • Analytical skills
  • Self-motivated and proactive
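As an illustration of the Kerberos administration and Hadoop user-onboarding skills the role calls for, the sketch below assembles the typical commands for provisioning a new user (Linux account, Kerberos principal, HDFS home directory). The helper name, realm, and exact flags are hypothetical assumptions; commands are returned as lists so they can be reviewed before being run (e.g. via subprocess.run).

```python
# Hypothetical helper sketching new-Hadoop-user onboarding steps:
# Linux account, Kerberos principal, and HDFS home directory.
# The realm and paths are illustrative assumptions, not a real setup.

def onboarding_commands(user, realm="EXAMPLE.COM"):
    """Return the shell commands (as argument lists) to onboard one user."""
    return [
        ["useradd", "-m", user],                                    # Linux account
        ["kadmin", "-q", f"addprinc -randkey {user}@{realm}"],      # Kerberos principal
        ["hdfs", "dfs", "-mkdir", "-p", f"/user/{user}"],           # HDFS home dir
        ["hdfs", "dfs", "-chown", f"{user}:{user}", f"/user/{user}"],
    ]
```

After provisioning, access would typically be verified by running a test query or listing against HDFS, Hive, and HBase as the new principal.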

Additional Information:


Our benefits include:

Medical, life and dental insurance, Asociacion Solidarista, International Share Save Plan, Flex Work, Work from home, Paid time off, Annual Performance Bonus, Education Reimbursement, Family Bonding, Bereavement Leave, Referral Program, and more.

Experian Careers - Creating a better tomorrow together

  • Experian is proud to be an Equal Opportunity and Affirmative Action employer. We're passionate about unlocking the power of data to transform lives and create opportunities.
