Brillio

Salesforce Senior Data Engineer - R01561387


Location

Bengaluru, IN

Salary

Not specified

Posted

3w ago

Job Type

Full Time

Experience

Senior

About the Role

Senior Data Engineer



Primary Skills
  • DataStream, ETL Fundamentals, SQL, SQL (Basic + Advanced), Python, Data Warehousing, Time Travel and Fail Safe, Snowpipe, SnowSQL, Modern Data Platform Fundamentals, PLSQL, T-SQL, Stored Procedures


Specialization
  • Snowflake Engineering: Data Engineer


Job requirements
  • Job Title: Senior Data Engineer – DBT & Python
  • Experience: 5+ Years

Job Summary
We are seeking a highly skilled Senior Data Engineer with strong expertise in DBT, Python, and SQL to design, develop, and optimize scalable data pipelines and ETL processes. The ideal candidate should have hands-on experience building robust data transformation workflows, working with Snowflake, and leveraging cloud platforms for analytics and reporting solutions.

Key Responsibilities
  • Design, develop, and maintain DBT models, transformations, and SQL code for analytics and reporting.
  • Build scalable and efficient ETL pipelines using DBT and other relevant tools.
  • Write complex, high-performance SQL queries to process and analyze large datasets.
  • Develop clean, scalable, and efficient code in Python, with strong hands-on experience in Pandas and NumPy.
  • Optimize data pipelines for performance, scalability, and cost efficiency.
  • Troubleshoot and resolve ETL and data pipeline performance issues.
  • Develop scripts using Unix shell scripting, Python, and other scripting tools for data extraction, transformation, and loading.
  • Write and optimize Snowflake SQL queries and support Snowflake implementations.
  • Work with orchestration tools such as Airflow or similar workflow management platforms.
  • Integrate user-facing elements into applications where required.
  • Collaborate with internal stakeholders to understand business requirements and translate them into technical solutions.
  • Ensure data quality, validation, and testing within DBT workflows.

Required Skills & Qualifications
  • 6+ years of overall IT experience.
  • Proven hands-on experience with DBT (Data Build Tool), including model development, transformations, and testing.
  • Strong programming expertise in Python (mandatory).
  • Advanced proficiency in SQL, including writing complex queries on large datasets.
  • Hands-on experience designing and maintaining ETL pipelines.
  • Experience with Unix shell scripting.
  • Strong understanding of data warehousing concepts and best practices.

Preferred Qualifications
  • Experience with Snowflake implementation and optimization.
  • Knowledge of Salesforce CDP.
  • Experience with Airflow or other data orchestration tools.
  • Exposure to cloud platforms such as AWS, GCP, or Azure.
  • Experience working with cloud storage solutions such as S3, GCS, or Azure Blob Storage.
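To make the responsibilities above concrete, here is a minimal sketch of the kind of pandas-based transformation and data-quality step the role describes: cleaning raw order records and aggregating daily revenue per customer. The column names (customer_id, order_ts, amount) and the sample data are illustrative assumptions, not part of the listing.

```python
import pandas as pd

def daily_revenue(raw: pd.DataFrame) -> pd.DataFrame:
    """Validate, clean, and aggregate raw order rows into daily totals."""
    df = raw.copy()
    # Coerce types; unparseable values become NaT/NaN instead of raising.
    df["order_ts"] = pd.to_datetime(df["order_ts"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Basic data-quality gate: drop rows that failed parsing.
    df = df.dropna(subset=["customer_id", "order_ts", "amount"])
    df["order_date"] = df["order_ts"].dt.date
    return (
        df.groupby(["customer_id", "order_date"], as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "daily_revenue"})
    )

# Hypothetical raw input, including one malformed timestamp that gets dropped.
raw = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c2"],
    "order_ts": ["2024-01-01 09:00", "2024-01-01 17:30", "2024-01-02 12:00", "bad"],
    "amount": [100.0, 50.0, 75.0, 20.0],
})
out = daily_revenue(raw)
```

In a DBT-centric pipeline the same aggregation would typically live in a SQL model with schema tests; this Python version shows the equivalent logic for a scripted extraction-and-load step.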



About Brillio

Brillio is hiring for this full time position in Bengaluru, IN. Visit the job listing to learn more about the company and apply.