Short Description:
Seeking a Senior Data Engineer in Noida, Uttar Pradesh, with 3-5 years of Big Data experience and proficiency in Hadoop, Spark, Python/Java/Scala, and real-time data pipeline development. Responsibilities include optimizing tools that handle petabytes of data, troubleshooting issues across the technical stack, and adopting emerging Hadoop-ecosystem technologies. Join a pioneering digital payments platform to do impactful work with vast user data and share in India's largest digital lending story.
Data Engineering - Senior Data Engineer
Location: Noida, Uttar Pradesh
Domain: Technology – Lending
Employment Type: On-roll / Remote
Position Title: Data Engineering – Senior Data Engineer
About Us:
Paytm stands at the forefront of India's digital payments and financial services sector, dedicated to engaging both consumers and merchants through a diverse range of payment solutions. Our platform offers services like utility payments, money transfers, and a variety of Paytm Payment Instruments (PPI), including Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag, and Paytm Postpaid - Buy Now, Pay Later. Catering to merchants, Paytm provides acquiring devices such as Soundbox, EDC, QR, and Payment Gateway, facilitating payment aggregation through PPI and other financial instruments. Additionally, we enhance merchants' businesses with commerce services through advertising and the Paytm Mini app store. Leveraging this platform, we extend credit services such as merchant loans, personal loans, and BNPL (Buy Now, Pay Later), sourced through our financial partners.
About the Role:
The Senior Data Engineer role involves working on intricate technical projects in an innovative, fast-paced environment. We seek a professional with a strong product design sense who specializes in Hadoop and Spark technologies.
Requirements:
- 3-5 years of experience with Big Data technologies.
Responsibilities:
- Contribute to the growth of our analytics capabilities by developing faster, more reliable tools for handling petabytes of data daily.
- Innovate and build new platforms that serve cluster users data in diverse shapes and forms, with low latency and horizontal scalability.
- Diagnose and resolve problems across the entire technical stack.
- Design and implement a real-time events pipeline for data ingestion, supporting real-time dashboards.
- Develop complex and efficient functions to transform raw data sources into robust components of our data lake.
- Design and implement new components using emerging technologies in the Hadoop ecosystem, ensuring successful project execution.
Skills for Success:
- Proficiency in Hadoop, MapReduce, Hive, Spark, PySpark, etc.
- Excellent programming/debugging skills in Python/Java/Scala.
- Experience with scripting languages such as Python, Bash, etc.
- Knowledge of NoSQL databases such as HBase and Cassandra is a plus.
- Hands-on programming experience with multithreaded applications.
- Experience with databases, SQL, and message queues such as Kafka.
- Familiarity with developing streaming applications (e.g., Spark Streaming, Flink, Storm); a brief sketch follows this list.
- Experience with cloud platforms such as AWS and services like S3.
- Knowledge of caching architectures like Redis is advantageous.
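As a rough illustration of the streaming skills above, here is a minimal sketch of a Spark Structured Streaming job that ingests events from Kafka and maintains per-minute counts of the kind a real-time dashboard might consume. The broker address, topic name, and event schema are illustrative assumptions, not details from this posting.

```python
# Minimal sketch of a real-time events pipeline (illustrative only).
# Requires the spark-sql-kafka connector on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-pipeline-sketch").getOrCreate()

# Assumed event schema; a real pipeline would derive this from the source system.
schema = StructType([
    StructField("event_type", StringType()),
    StructField("merchant_id", StringType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a placeholder Kafka topic and parse the JSON payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "payment-events")             # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Tumbling one-minute windows of event counts per type, tolerating late data.
counts = (
    events.withWatermark("event_time", "5 minutes")
    .groupBy(window(col("event_time"), "1 minute"), col("event_type"))
    .count()
)

# The console sink stands in for a dashboard-facing store.
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```

In practice, the console sink would be swapped for a serving store or another topic that the dashboard reads from.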
Why Join Us:
- Make a meaningful difference and enjoy the journey.
- Be challenged and encouraged to build something that matters, for yourself and for those we serve.
- Work with a team that values the potential of technology to positively impact people.
- Join a successful organization where collective energy and unwavering focus on the customer drive our successes.
Compensation: For the right fit, we believe in creating wealth for you. With over 500 million registered users, 21 million merchants, and a wealth of data in our ecosystem, we are uniquely positioned to democratize credit for deserving consumers and merchants. Join us and become part of India's largest digital lending story!
Please click here to apply.