About Dunzo:

Dunzo is an Indian company that delivers groceries and essentials, fruits and vegetables, meat, pet supplies, food, and medicines in major cities. It also has a separate service to pick up and deliver packages within the same city.

Job Responsibilities:

Ingest and aggregate data from internal and external sources to build our datasets. 

Build large-scale batch and real-time data pipelines with data processing frameworks like Apache Beam and Spark on Google Cloud Platform.

Continuously optimize our data warehouse and ensure it is performant, cost-effective, and secure.

Apply best practices in continuous integration and delivery with CI/CD tools and frameworks like Airflow and dbt.

Help drive optimization, testing, and tooling to improve data quality.

Collaborate with other engineers, business analysts, and stakeholders, taking the learning and leadership opportunities that arise daily.

Eligibility Criteria:

Any graduate from any stream.

Have 0-1 years of experience working with heterogeneous data and systems.

You know how to ensure data quality and champion automation across the data platform.

Preferred Skills:

Committed to quality, including security and performance

Strong interpersonal, communication, and presentation skills

APPLY HERE