
Brillio
UMG - GCP Data Specialist - R01559803
Lead AI/ML Engineer
Salary: Competitive
Location: Remote
Job Type: Full Time
Posted: 3 months ago
About the Role
Lead AI/ML Engineer
Key Skills
Data Engineering & Analytics
- SQL (Advanced)
- Data Modeling & Query Optimization
- BigQuery (DDL, Views, Authorized Views)
- ETL / ELT Pipeline Development
Cloud & GCP Services
- Google Cloud Platform (GCP)
- BigQuery
- Google Cloud Storage (GCS)
- Dataflow
- Pub/Sub
Programming & Tools
- Python (data processing, scripting)
- Java (basic understanding for data workflows)
- Apache Airflow (orchestration and scheduling)
- API Integration
Visualization & Reporting
- Looker Studio (dashboard development)
- Automated reporting using SQL
Roles & Responsibilities
- Design, develop, and optimize scalable data pipelines to ingest, process, and store data from Google Cloud Storage (GCS) to BigQuery (see the first sketch after this list).
- Build and maintain API-integrated workflows to extract data from external systems, process responses, and load results into downstream platforms.
- Implement Pub/Sub-based event-driven architectures supporting both real-time and batch data processing, including message handling and pipeline triggers (see the subscriber sketch after this list).
- Create and manage BigQuery DDLs, views, and authorized views, ensuring secure access through appropriate roles and permissions (see the authorized-view sketch after this list).
- Optimize ETL pipelines and complex SQL queries to improve performance, reduce processing time, and enhance BigQuery warehouse efficiency.
- Perform DEV and UAT testing, validating business logic, data quality, and end-to-end pipeline stability.
- Apply business transformation logic to convert raw datasets into analytics- and reporting-ready data models.
- Develop weekly and monthly SQL-based reports and automate their distribution to business stakeholders via email.
- Design and publish interactive dashboards using Looker Studio to enable clear data visualization and actionable insights.
- Collaborate closely with clients, business analysts, and stakeholders to gather requirements and deliver accurate, efficient, and interpretable data solutions.
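To illustrate the first responsibility above, a minimal GCS-to-BigQuery batch load with the google-cloud-bigquery client could look like the sketch below. The project, bucket, and table names are placeholders, and a production pipeline would typically pin an explicit schema rather than relying on autodetection.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder source files and destination table.
source_uri = "gs://example-bucket/exports/orders_*.csv"
table_id = "example-project.analytics.orders_raw"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # illustrative only; real pipelines usually declare a schema
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and block until it completes.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```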
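For the Pub/Sub responsibility, a common pattern is a streaming-pull subscriber whose callback triggers the downstream load or transformation step. A bare-bones sketch, with a made-up project and subscription name:

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
# Placeholder project and subscription identifiers.
subscription_path = subscriber.subscription_path("example-project", "ingest-events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real handler would kick off the pipeline step encoded in the message.
    print(f"Received event: {message.data!r}")
    message.ack()  # acknowledge only after processing succeeds

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result(timeout=60)  # listen for one minute, then stop
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```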
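The authorized-views bullet usually comes down to two steps: create a view in a dataset that consumers can read, then grant that view access to the source dataset so consumers never need direct access to the raw tables. A sketch with placeholder project and dataset names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder datasets: "private" holds raw data, "shared_views" is exposed to analysts.
view = bigquery.Table("example-project.shared_views.customer_summary")
view.view_query = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `example-project.private.orders`
    GROUP BY customer_id
"""
view = client.create_table(view)

# Authorize the view against the source dataset so readers of the view
# do not need roles on the underlying tables.
source_dataset = client.get_dataset("example-project.private")
entries = list(source_dataset.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))
source_dataset.access_entries = entries
client.update_dataset(source_dataset, ["access_entries"])
```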
Required Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Strong experience in SQL and BigQuery within large-scale data environments.
- Hands-on experience with GCP services, especially BigQuery, GCS, Dataflow, and Pub/Sub.
- Proficiency in Python for data manipulation and pipeline development.
- Experience with Apache Airflow or similar orchestration tools (a minimal DAG sketch follows this list).
- Solid understanding of data modeling, ETL best practices, and performance tuning.
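To give a concrete sense of the orchestration requirement, a minimal Airflow DAG using the Google provider's GCS-to-BigQuery transfer operator might look like the sketch below (Airflow 2.4+ style; the bucket, table, and schedule are all illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the previous day's export into the warehouse; placeholder names throughout.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-bucket",
        source_objects=["exports/orders_{{ ds_nodash }}.csv"],
        destination_project_dataset_table="example-project.analytics.orders_raw",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )
```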
Good to Have
- Exposure to Java-based data processing frameworks
- Experience working in client-facing or consulting environments
- Knowledge of real-time data processing patterns and event-driven architectures