Responsibilities
● Design, develop, and maintain scalable data pipelines and ETL processes
● Optimize data flow and collection for cross-functional teams
● Build infrastructure required for optimal extraction, transformation, and loading of data
● Ensure data quality, reliability, and integrity across all data systems
● Collaborate with data scientists and analysts to help implement models and algorithms
● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
● Create and maintain comprehensive technical documentation
● Evaluate and integrate new data management technologies and tools

Requirements
● 3-5 years of professional experience in data engineering roles
● Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
● Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
● Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
● Proficiency in at least one programming language such as Python, Java, or Scala
● Experience with data modeling, data warehousing, and building ETL pipelines
● Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi)
● Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
● Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
● Understanding of data governance and data security principles
● Experience with version control systems (e.g., Git) and CI/CD practices

Preferred Skills
● Experience with containerization and orchestration tools (Docker, Kubernetes)
● Basic knowledge of machine learning workflows and MLOps
● Experience with NoSQL databases (MongoDB, Cassandra, etc.)
● Familiarity with data visualization tools (Tableau, Power BI, etc.)
● Experience with real-time data processing
● Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
● Experience with infrastructure-as-code tools (Terraform, CloudFormation)

Personal Qualities
● Strong problem-solving skills and attention to detail
● Excellent communication skills, both written and verbal
● Ability to work independently and as part of a team
● Proactive approach to identifying and solving problems