Senior Big Data Solutions Designer (GCP)
Tredence Inc.
All India • 2 months ago
Experience: 4 to 8 Yrs
Job Description
As a GCP Architect at Tredence, your role will involve collecting, storing, processing, and analyzing very large data sets. Your primary focus will be choosing optimal solutions for these purposes; implementing, maintaining, and monitoring them; and integrating them with the architecture used across our clients.
**Key Responsibilities:**
- Select and integrate the Big Data tools and frameworks required to provide the requested capabilities.
- Develop and maintain data pipelines implementing ETL processes, monitor performance, and advise on necessary infrastructure changes.
- Translate complex technical and functional requirements into detailed designs.
- Investigate and analyze alternative solutions for data storage and processing to ensure the most streamlined approaches are implemented.
- Serve as a mentor to junior staff by conducting technical training sessions and reviewing project outputs.
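The pipeline responsibility above can be pictured as a minimal extract-transform-load flow. This is a purely illustrative sketch; the function names, sample records, and in-memory "warehouse" are hypothetical and not part of Tredence's actual stack:

```python
# Illustrative ETL sketch: extract raw records, transform them,
# and load the results into a target store (here, an in-memory list).
# All names and sample data are hypothetical.

def extract():
    """Extract: pull raw records from a source system."""
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(records):
    """Transform: clean and type-cast fields."""
    return [{"id": r["id"], "amount": float(r["amount"])} for r in records]

def load(records, target):
    """Load: write transformed records to the target store."""
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 records loaded
```

In a production pipeline the same three stages would typically be orchestrated as tasks in a tool such as Airflow or Cloud Composer, with monitoring attached to each stage.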
**Qualifications Required:**
- Solid understanding of data warehousing and data modeling techniques.
- Proficient understanding of distributed computing principles such as Hadoop v2, MapReduce, and HDFS.
- Strong data engineering skills on GCP services such as Cloud Composer (managed Airflow), Data Fusion, Dataflow, Dataproc, and BigQuery.
- Experience with building stream-processing systems using solutions such as Storm or Spark Streaming.
- Good knowledge of Big Data querying tools like Pig, Hive, and Impala.
- Experience with Spark, SQL, and Linux.
- Knowledge of various ETL techniques and frameworks such as Flume, Apache NiFi, or dbt.
- Experience with various messaging systems like Kafka or RabbitMQ.
- Good understanding of Lambda Architecture, along with its advantages and drawbacks.
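The last qualification, Lambda Architecture, combines a batch layer (accurate views recomputed over all historical data) with a speed layer (incremental views over recent data), merged at query time by a serving layer. A toy sketch of that merge, using hypothetical event data and plain Python in place of real batch and streaming engines:

```python
from collections import Counter

# Toy Lambda Architecture sketch: the batch layer recomputes an accurate
# view over the full historical dataset, the speed layer keeps an
# incremental view over recent events, and the serving layer merges both.
# Event data and function names are hypothetical.

def batch_view(historical_events):
    """Batch layer: recompute per-user counts from scratch over all data."""
    return Counter(e["user"] for e in historical_events)

def speed_view(recent_events):
    """Speed layer: counts over events not yet absorbed by the batch view."""
    return Counter(e["user"] for e in recent_events)

def serve(batch, speed, user):
    """Serving layer: merge batch and real-time views at query time."""
    return batch.get(user, 0) + speed.get(user, 0)

historical = [{"user": "a"}, {"user": "b"}, {"user": "a"}]
recent = [{"user": "a"}]

b = batch_view(historical)
s = speed_view(recent)
print(serve(b, s, "a"))  # 2 from batch + 1 from speed = 3
```

The architecture's main advantage is fault tolerance (the batch layer can always rebuild correct views); its main drawback is maintaining the same logic in two codebases, one batch and one streaming.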
At Tredence, you will have the opportunity to work with some of the smartest, friendliest, and hardest-working people in the data analytics space. You will work with the latest technologies and interface directly with key decision-makers at our clients, some of the largest and most innovative businesses in the world. We offer a 401k match, full medical, dental, and vision benefits, a fun team atmosphere, and work-life balance. Our people are our greatest asset, and we value every one of them. Come see why we are so successful in one of the most competitive and fastest-growing industries in the world.
*Tredence 5 core values:*
- Think Impact
- Be Entrepreneurial
- Constantly curious
- Do the right thing
- Pursue Excellence
Tredence is an equal opportunity employer, celebrating and supporting diversity and committed to creating an inclusive environment for all employees. Visit our Website, YouTube page, and LinkedIn page for more details.
Skills Required
data warehousing
data modeling
distributed computing
Hadoop
MapReduce
HDFS
Airflow
Data Fusion
Dataflow
Storm
Pig
Hive
Impala
Spark
SQL
Linux
Flume
dbt
Kafka
RabbitMQ
GCP cloud platforms
Cloud Composer
Dataproc
BigQuery
stream-processing systems
Spark Streaming
Big Data querying tools
ETL techniques
Apache NiFi
messaging systems
Lambda Architecture
Posted on: March 6, 2026