We also welcome couriers who have worked with other peer-to-peer ridesharing or delivery networks.
Job Description: Must have solid experience in large-scale application development using the Big Data ecosystem: Hadoop (HDFS, MapReduce, YARN), Hive, and Kafka. Should have hands-on experience with Spark, Scala, and Spark SQL.
Minimum 3 years of proficiency in Python for data manipulation and extracting insights from large datasets. Advanced degree in Statistics, Data Science, Mathematics, Computer Science, or a related quantitative field. Proven experience with cloud data platforms and modern model and data pipeline technologies.