Short Description:
The Associate Data Engineer position at Zscaler in Bangalore involves evaluating and implementing data applications/technologies to meet evolving business needs. Collaborating with various teams, the role focuses on identifying data source requirements, profiling data quality, and building data pipelines for integration into Zscaler's Snowflake data warehouse. The responsibilities include continuous optimization of existing data integrations, development of large-scale data pipelines using modern cloud and big data architectures, and adherence to data management standards. The ideal candidate, with 0 to 3 years of experience, should possess proficiency in data modeling, data pipeline building, and programming languages like Python, along with strong analytical skills. Zscaler is an equal opportunity employer, promoting diversity and inclusivity, and complies with legal standards and pay transparency rules. Additionally, the company provides support for differently-abled candidates during the recruiting process.
Job Title: Associate Data Engineer
Location: Bangalore, India
About Zscaler
Zscaler (NASDAQ: ZS) is dedicated to accelerating digital transformation to make customers more agile, efficient, resilient, and secure. The Zscaler Zero Trust Exchange, their cloud-native platform, safeguards thousands of customers globally from cyber threats and data loss by securely connecting users, devices, and applications irrespective of their location.
With over a decade of experience in cloud development and operation, Zscaler serves thousands of enterprise customers worldwide, including 450 of the Forbes Global 2000 organizations. Beyond protecting against threats like ransomware and data exfiltration, Zscaler helps reduce cost and complexity and improve the user experience by eliminating latency-inducing gateway appliances.
Established in 2007, Zscaler's mission is to ensure the cloud is a safe place for business and an enjoyable experience for enterprise users. Their security platform, purpose-built to defend against threats where connections occur—the internet—ensures every connection is both fast and secure, regardless of user connection methods or application locations.
Position: Associate Data Engineer
Responsibilities/What You’ll Do
Evaluate and Implement Data Technologies: Support the assessment and implementation of current and future data applications/technologies to meet evolving Zscaler business requirements.
Collaboration: Collaborate with IT business engagement and applications engineer teams, enterprise data engineering, and business data partner teams to identify data source requirements.
Data Profiling and Preparation: Profile and quantify the quality of data sources, develop tools to prepare data, and construct data pipelines for integration into Zscaler's data warehouse in Snowflake.
Optimization: Continuously optimize existing data integrations, data models, and views while developing new features and capabilities to address business partners' needs.
Data Management Standards: Work with Data Platform Lead to design and implement data management standards and best practices.
Technology Advancement: Continuously learn and develop next-generation technology/data capabilities to enhance data engineering solutions.
Large-Scale Data Pipelines: Develop large-scale and mission-critical data pipelines using modern cloud and big data architectures.
Qualifications/Your Background
Experience: 0 to 3 years of experience in data warehouse design and development.
Data Pipeline Proficiency: Proficiency in building data pipelines to integrate business applications (Salesforce, Netsuite, Google Analytics, etc.) with Snowflake.
Data Modeling: Proficiency in dimensional data modeling techniques, with the ability to write structured and efficient queries on large data sets.
Programming Skills: Hands-on experience in Python to extract data from APIs and build data pipelines.
SQL Proficiency: Full proficiency in advanced SQL and in Python/Snowpark (PySpark) or Scala (object-oriented language concepts), along with ML libraries.
ELT Tools: Strong hands-on experience with ELT tools such as Matillion, Fivetran, Talend, or IDMC (Matillion preferred), with the data transformation tool dbt, and with AWS services such as EC2, S3, Lambda, and Glue.
Data Visualization: Knowledge of data visualization tools such as Tableau and/or Power BI.
Analytical Skills: Must demonstrate strong analytical skills, attention to detail, teamwork, and the ability to manage multiple projects simultaneously.
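As a rough illustration of the kind of Python work the qualifications above describe (extracting data from an API and shaping it for a warehouse load), here is a minimal sketch. The API payload shape, field names, and helper function are all invented for the example; a real pipeline would pull from a live endpoint and load the rows into Snowflake.

```python
import json

def extract_records(api_response: str, fields: list[str]) -> list[tuple]:
    """Flatten a JSON API payload into rows ready for a warehouse load.

    Assumes a hypothetical API that returns {"data": [{...}, ...]}.
    """
    payload = json.loads(api_response)
    rows = []
    for record in payload.get("data", []):
        # Missing fields become None so every row has the same shape.
        rows.append(tuple(record.get(f) for f in fields))
    return rows

# Sample payload as a hypothetical CRM API might return it.
sample = json.dumps({
    "data": [
        {"id": 1, "account": "Acme", "amount": 1200},
        {"id": 2, "account": "Globex"},
    ]
})

rows = extract_records(sample, ["id", "account", "amount"])
# rows == [(1, "Acme", 1200), (2, "Globex", None)]
```

In practice, the resulting rows would be staged and loaded with a Snowflake connector or an ELT tool such as those listed above; the sketch covers only the extract-and-shape step.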
Please click here to apply.