Startup Careers

Be a part of our family by contributing to our portfolio companies’ innovation and success. Browse open positions below with Intel Capital portfolio companies.

Hadoop Platform Software Engineer at DataRobot
Boston, MA, US / Columbus, OH, US

  • Integrates existing products with Hadoop and supports existing Hadoop infrastructure.
  • Automates testing of existing products against different Hadoop distributions (such as Cloudera and Hortonworks).
  • Develops software to integrate existing products with different Hadoop distributions.
  • Plans further improvements to DataRobot’s Hadoop experience.
  • Confirms that clients’ systems and infrastructure are suitable for running DataRobot on Hadoop.
  • Ensures existing automation is reliable and extends it with Hadoop-related pipelines.
  • Ensures new application releases are properly tested with Hadoop.
  • Tracks new Hadoop versions and verifies that the application works with them.
  • Ensures the Hadoop portion of the application meets the standards of Hadoop providers.
  • Provides documentation and tests for new software.
  • Improves job knowledge by studying the architecture of different distributed computing frameworks and the Hadoop ecosystem.
  • Participates in educational opportunities and reads professional publications.
  • Contributes to team effort by accomplishing related results as needed.

Main Requirements:

  • Bachelor’s degree in Computer Information Systems, Project Management, Engineering, Physics, or a related field.
  • 3 years of experience as a Software Engineer or in a related occupation.
  • 3 years of experience supporting Hadoop customer integrations and administering enterprise production Hadoop clusters.
  • 3 years of experience with Linux system configuration and administration.
  • 3 years of experience with each of the following:
    • Hadoop ecosystem security technologies such as Kerberos, LDAP, and Active Directory.
    • Core Hadoop architecture components (HDFS, YARN, ZooKeeper).
    • Cloud technologies such as AWS.
    • Configuration management tools such as Ansible, Puppet, Chef.
    • Java or Scala and scripting in Python and Bash.
    • Multiple Hadoop distributions (Cloudera, Hortonworks, etc.).
  • 3 years of experience with Docker.
  • Hadoop certification from Cloudera or Hortonworks.