Experience: 4–6 Years
Job Description
We are looking for an experienced Data Engineer with strong expertise in PySpark, Python, or Scala and hands-on experience in Google Cloud Platform (GCP). The ideal candidate should be passionate about big data processing, cloud technologies, and scalable data solutions.
Key Skills & Requirements
- Minimum 4 years of experience in data engineering
- Strong proficiency in PySpark, Python, or Scala (with Spark)
- Mandatory: Hands-on experience with Google Cloud Platform (GCP) and BigQuery
- Solid knowledge of SQL — ability to write and optimize complex queries
- Exposure to REST or GraphQL APIs (added advantage)
- Understanding of distributed systems and big data processing tools
- Cloud certification in GCP is a plus
- Knowledge of AI/ML concepts is an added advantage
Responsibilities
- Design, develop, and maintain large-scale data pipelines using Spark and GCP.
- Work with cross-functional teams to integrate and process large datasets efficiently.
- Ensure data quality, consistency, and security across all environments.
- Optimize performance for data retrieval and analytics using BigQuery.
- Troubleshoot and resolve data-related issues independently.
Apply:
Send your resume to: hr@techfokes.com
