CASY-MSCCN Jobs


Job Information

Nielsen Data Engineer - (Spark, Scala, Python, Cassandra, Elasticsearch, AWS, Airflow, SQL) in Bangalore, India

At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future.

Responsibilities

  • Work closely with team leads and backend developers to design and develop functional, robust pipelines to support internal and customer needs

  • Write both unit and integration tests, and develop automation tools for daily tasks

  • Develop high quality, well documented, and efficient code

  • Manage and optimize scalable pipelines in the cloud

  • Optimize internal and external applications for performance and scalability

  • Develop automated data quality tests to ensure business needs are met

  • Communicate regularly with stakeholders, project managers, quality assurance teams, and other developers regarding progress on the long-term technology roadmap

  • Recommend systems solutions by comparing advantages and disadvantages of custom development and purchased alternatives
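
The data quality testing called out above might, for example, take the shape of a small check run against pipeline output before it is published. This is a hypothetical sketch only; the column names (`household_id`, `panel_weight`) and rules are illustrative and not taken from the posting:

```python
# Hypothetical sketch of a data-quality check for pipeline output.
# Column names and rules are illustrative only, not from the posting.

def check_quality(rows, required=("household_id", "panel_weight")):
    """Return a list of human-readable violations found in `rows`."""
    violations = []
    for i, row in enumerate(rows):
        # Every required column must be present and non-null.
        for col in required:
            if row.get(col) is None:
                violations.append(f"row {i}: missing {col}")
        # A simple range rule: weights must be non-negative.
        weight = row.get("panel_weight")
        if weight is not None and weight < 0:
            violations.append(f"row {i}: negative panel_weight {weight}")
    return violations

if __name__ == "__main__":
    sample = [
        {"household_id": "H1", "panel_weight": 1.2},
        {"household_id": None, "panel_weight": -0.5},
    ]
    for v in check_quality(sample):
        print(v)
```

In a production pipeline a check like this would typically run as its own task (e.g. an Airflow task downstream of the transform step) and fail the run when violations are found, rather than printing them.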

Key Skills

  • Domain Expertise

  • 2+ years of experience as a software/data engineer

  • Bachelor’s degree in Computer Science, MIS, or Engineering

  • Technical Skills

  • Experience in software development with the following languages and tools/services: Java or Scala; big data tooling (Hadoop, Spark, Spark SQL, Presto/Hive); cloud platforms (preferably AWS); Docker; RDBMS (such as Postgres and/or Oracle); Linux and shell scripting; GitLab; Airflow; Cassandra; and Elasticsearch.

  • Experience in big data processing using Apache Spark with Scala.

  • Experience with orchestration tools such as Apache Airflow.

  • Strong knowledge of Unix/Linux, shell commands and scripting, Python, JSON, and YAML.

  • Agile/Scrum experience in application development is required.

  • Strong knowledge of AWS S3 and PostgreSQL or MySQL.

  • Strong knowledge of AWS compute services: EC2, EMR, and Lambda.

  • Strong knowledge of GitLab or Bitbucket.

  • AWS certification is a plus.

  • Experience with "big data" systems and analysis.

  • Experience with data warehouses or data lakes

  • Mindset and Attributes

  • Strong communication skills, with the ability to explain complex technical concepts and align the organization on decisions

  • Sound problem-solving skills with the ability to quickly process complex information and present it clearly and simply

  • Collaborates with the team to create innovative solutions efficiently
