We're always searching for amazing people to join our team. Take a look at our current openings.


Benefits of Working at Data Piper

Medical Care

We provide highly competitive medical plans, including both PPO and HMO options, as well as excellent dental and vision plans.

Growth & Development

We provide ample opportunities for training and for hands-on work with the latest technologies.

Time Off

We provide highly competitive time off for our full-time salaried employees.

Current Openings

Posted 1 month ago

  • Hands-on BigQuery development experience is a must
  • Hands-on Cloud SQL experience
  • Advanced Python and Bash scripting experience
  • Advanced knowledge of the GCP admin APIs across services for automation, including the gcloud SDK and the Python SDK for GCP
  • GCP certification is a huge plus.

Posted 1 month ago

  • Strong Terraform experience is a must
  • Experience with the full infrastructure-as-code lifecycle is crucial
  • Ideal candidates can operate effectively at enterprise levels of complexity.
  • Ideal candidates are hands-on.
  • Clearly differentiates between CI and CD, and has implemented both
  • Familiarity with modern test frameworks and development tools such as Jenkins, Docker, Chef, and Puppet

Must-Have Skills:

  • Significant Terraform experience with non-trivial orchestration, including approaches that go beyond the native providers and that are structured for proper reuse.
  • Experience using Terraform and its lifecycle in large environments.
  • Experience working in large environments and tackling the complexity of running on cloud-native technologies.
  • Strong hands-on DevOps skills, with Spinnaker and Terraform experience to support the delivery of infrastructure automation.
  • Demonstrated depth and complexity in the environments they have worked in.
  • At least some exposure to cloud-native and agile best practices.
  • Knowledge of segregation and immutable-infrastructure approaches, with at least exposure to the concepts and their value going forward.
  • Experience in environments that are adopting modern development practices to increase velocity and quality.

Posted 1 month ago

  • Minimum 2 years of hands-on experience using GCP services (e.g., Dataproc, Pub/Sub, BigQuery, Dataflow, Cloud Run/Cloud Functions, GKE), with a strong background in Big Data technologies
  • Hands-on experience with large data transfers into GCP target databases and building high-throughput data pipelines
  • Experience in SQL tuning to address performance and cost considerations
  • Minimum 2 years of hands-on experience with Google Cloud Platform (GCP) compute, DevOps, storage, and security components
  • Strong background building custom data pipelines targeting GCP data stores (Bigtable, GCS, BigQuery)
  • Prior experience migrating workloads from on-premises to the cloud; legacy modernization experience is desirable
  • Knowledge of VPC, Private/Public Subnet, Network Security Groups, Firewalls
  • Strong analytical, problem-solving, data analysis, and research skills