Senior Databricks Cloud Developer - Senior Associate
📍 Job Overview
- Job Title: Senior Databricks Cloud Developer - Senior Associate
- Company: State Street
- Location: Hyderabad, Telangana, India
- Job Type: Hybrid (mix of on-site and remote work)
- Category: Backend Developer, Data Engineer
- Date Posted: June 25, 2025
- Experience Level: 10+ years
🚀 Role Summary
- Develop and maintain Azure-based applications, data warehouses, and ETL backend systems using Databricks and Snowflake.
- Collaborate with business users to understand requirements and use cases, and translate them into efficient database solutions.
- Design, code, and optimize Snowflake objects and views, including clustering and partitioning strategies, to ensure optimal performance and scalability.
- Work with large and complex data sets, handling JSON, ORC, PARQUET, and CSV files from various sources like AWS S3.
- Build continuous-loading and scheduled pipelines using Snowpipe, DBT, and Snowflake TASKs or Cron.
- Apply knowledge of Snowflake cloud architecture and SnowSQL to continuous data loading.
- Bring strong problem-solving, communication, interpersonal, and analytical skills.
📝 Enhancement Note: This role requires a deep understanding of data warehousing concepts, schema types, and cloud-based data processing. Familiarity with big data technologies like Spark and Databricks is a plus.
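To make the Snowpipe and TASK expectations above concrete, here is a minimal sketch of the pattern. All object names (`raw_stage`, `trades_raw`, `trades_curated`, `etl_wh`) are illustrative, not taken from the posting:

```sql
-- Hypothetical sketch: continuous loading with Snowpipe plus a scheduled TASK.
-- AUTO_INGEST additionally requires cloud event notifications to be configured.
CREATE PIPE IF NOT EXISTS trades_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO trades_raw
  FROM @raw_stage/trades/
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- A TASK runs a transformation on a cron-style schedule.
CREATE TASK IF NOT EXISTS refresh_trades_daily
  WAREHOUSE = etl_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'   -- 02:00 UTC daily
AS
  INSERT INTO trades_curated
  SELECT * FROM trades_raw WHERE load_date = CURRENT_DATE();

ALTER TASK refresh_trades_daily RESUME;   -- tasks are created suspended
```

This is a sketch under stated assumptions, not a production pipeline; in practice the COPY target schema, file layout, and task dependencies would follow the team's conventions.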
💻 Primary Responsibilities
- Data Warehouse Development: Design, develop, and maintain data warehouses and ETL systems using Snowflake and Azure.
- Data Processing & Transformation: Process and transform large datasets using Databricks, Spark, and Snowpipe.
- Performance Tuning & Optimization: Optimize database queries, improve performance, and tune Snowflake objects and views.
- Collaboration & Communication: Work with business users, understand requirements, and provide technical solutions.
- Problem Solving: Identify, analyze, and resolve complex data-related issues.
- Agile Methodologies: Work within an agile team, following agile principles and best practices.
📝 Enhancement Note: This role involves a mix of technical development, data analysis, and collaboration with stakeholders. Strong problem-solving skills and the ability to communicate complex technical concepts effectively are crucial.
🎓 Skills & Qualifications
Education: Bachelor's degree in Computer Science, Engineering, or a related field. Relevant certifications (e.g., Snowpro, Azure) are a plus.
Experience: Minimum 8 years of experience in data warehousing, database application development, and cloud technologies. Proven experience in Snowflake (4+ years) and Azure.
Required Skills:
- Strong proficiency in Snowflake, including data modeling, query optimization, and performance tuning.
- Experience in Azure-based application development and data processing.
- Hands-on experience with Databricks, Spark, and big data technologies.
- Proficient in SQL, with experience in writing complex queries and optimizing performance.
- Experience in shell scripting and job scheduling tools like Autosys, Control-M, or Cron.
- Good knowledge of data warehousing concepts and schema types (Star, Snowflake).
- Strong communication, interpersonal, and analytical skills.
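As a refresher on the schema types named above, a minimal star-schema sketch (table and column names are hypothetical):

```sql
-- Star schema: one fact table joined directly to denormalized dimensions.
-- A snowflake schema would further normalize dim_product into separate
-- product -> category -> department tables.
CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,
    product_name  VARCHAR,
    category_name VARCHAR       -- denormalized into the dimension (star)
);

CREATE TABLE fact_sales (
    sale_id     INTEGER,
    product_key INTEGER REFERENCES dim_product (product_key),
    sale_date   DATE,
    amount      NUMBER(12, 2)
);
```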
Preferred Skills:
- Experience with big data technologies like Hadoop, Hive, or Pig.
- Familiarity with ETL tools and data integration platforms.
- Knowledge of Oracle databases and proficiency in SQL performance tuning.
- Experience with data governance, data quality, and metadata management.
- Familiarity with cloud architecture, data pipelines, and data streaming.
📝 Enhancement Note: While not explicitly mentioned, experience with data governance, data quality, and metadata management would be beneficial for this role, given the focus on data warehousing and ETL systems.
📊 Portfolio & Project Requirements
Portfolio Essentials:
- Include projects showcasing data warehousing and ETL system development using Snowflake and Azure.
- Demonstrate proficiency in data processing, transformation, and optimization using Databricks and Spark.
- Highlight problem-solving skills by presenting case studies of complex data-related issues and how you resolved them.
Technical Documentation:
- Document code quality, commenting, and data transformation processes.
- Include version control, deployment processes, and server configuration details.
- Showcase testing methodologies, performance metrics, and optimization techniques.
📝 Enhancement Note: Given the focus on data warehousing and ETL systems, ensure your portfolio includes well-documented projects demonstrating data processing, transformation, and optimization using relevant technologies.
💵 Compensation & Benefits
Salary Range: INR 25-35 lakhs per annum (Estimated based on industry standards for a senior data engineer role in Hyderabad with 10+ years of experience)
Benefits:
- Medical Care
- Insurance
- Savings Plans
- Flexible Work Programs
- Development Programs
- Educational Support
- Paid Volunteer Days
- Matching Gift Programs
Working Hours: 40 hours per week, with flexibility for project deadlines and maintenance windows.
📝 Enhancement Note: The salary range is estimated based on industry standards for a senior data engineer role in Hyderabad with 10+ years of experience. Research methodology includes Glassdoor, Payscale, and Indeed salary reports for similar roles in the region.
🎯 Team & Company Context
Company Culture:
- Industry: Financial Services
- Company Size: Large (Over 35,000 employees worldwide)
- Founded: 1792
- Team Structure: The Global Technology Services (GTS) team is responsible for driving the company's digital transformation and expanding business capabilities using advanced technologies like cloud, AI, and RPA. The team follows agile methodologies and values collaboration, innovation, and continuous learning.
- Development Methodology: Agile/Scrum methodologies, with a focus on sprint planning, code review, testing, and quality assurance.
Company Website: State Street
📝 Enhancement Note: State Street is a large, established financial services company with a strong focus on technology and innovation. The GTS team plays a vital role in driving the company's digital transformation and expanding business capabilities using advanced technologies.
📈 Career & Growth Analysis
Career Level: Senior Associate - Responsible for leading projects, mentoring junior team members, and driving technical decisions related to data warehousing and ETL systems.
Reporting Structure: This role reports directly to the Manager, Data Engineering within the Global Technology Services (GTS) team.
Technical Impact: This role has a significant impact on the company's data infrastructure, ensuring accurate and efficient data processing, transformation, and storage. The successful candidate will contribute to the development and maintenance of data warehouses and ETL systems, enabling better data-driven decision-making and improving overall business performance.
Growth Opportunities:
- Technical Growth: Expand expertise in cloud-based data processing, big data technologies, and data governance.
- Leadership Growth: Develop leadership skills by mentoring junior team members and driving technical decisions within the team.
- Architecture Growth: Gain experience in designing and implementing data architecture, data pipelines, and data streaming solutions.
📝 Enhancement Note: This role offers significant growth opportunities in both technical and leadership domains. The successful candidate can expect to expand their expertise in cloud-based data processing, big data technologies, and data governance while developing their leadership and architecture skills.
🌐 Work Environment
Office Type: Hybrid - A mix of on-site and remote work, with flexibility for employees to work from home or the office.
Office Location(s): Hyderabad, Telangana, India
Workspace Context:
- Collaborative workspace with a focus on agile team dynamics and cross-functional collaboration.
- Access to development tools, multiple monitors, and testing devices to support efficient data processing and transformation.
- Opportunities for knowledge sharing, technical mentoring, and continuous learning within the team and across the organization.
Work Schedule: Flexible work schedule with project deadline and maintenance window accommodations.
📝 Enhancement Note: State Street offers a collaborative and flexible work environment, with a focus on agile team dynamics and cross-functional collaboration. The hybrid work arrangement provides employees with the flexibility to work from home or the office, depending on their preferences and project requirements.
📄 Application & Technical Interview Process
Interview Process:
- Technical Assessment: A hands-on assessment focusing on data warehousing, ETL processes, and cloud-based data processing using Snowflake, Azure, and Databricks.
- Behavioral & Cultural Fit Assessment: An interview focusing on problem-solving skills, communication, interpersonal skills, and cultural fit within the team and organization.
- Final Evaluation: A final interview with the hiring manager or a panel of senior team members to assess technical skills, cultural fit, and career aspirations.
Portfolio Review Tips:
- Highlight projects demonstrating data warehousing, ETL system development, and data processing/transformation using Snowflake, Azure, and Databricks.
- Include case studies showcasing problem-solving skills and the ability to optimize data processing and transformation.
- Provide clear and concise documentation of code quality, data transformation processes, and testing methodologies.
Technical Challenge Preparation:
- Brush up on data warehousing concepts, schema types, and cloud-based data processing techniques.
- Practice writing complex SQL queries, optimizing performance, and tuning Snowflake objects and views.
- Familiarize yourself with Azure-based application development and data processing using Databricks and Spark.
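For the tuning portion, it helps to know the Snowflake-specific levers: clustering keys and micro-partition pruning rather than traditional indexes. A small illustrative example (table and column names are hypothetical):

```sql
-- Clustering a large table lets Snowflake prune micro-partitions on the
-- filter column instead of scanning the whole table.
ALTER TABLE trades_curated CLUSTER BY (trade_date);

-- Inspect how well the clustering key fits the query's filter pattern.
SELECT SYSTEM$CLUSTERING_INFORMATION('trades_curated', '(trade_date)');

-- Prefer explicit columns and selective range filters over SELECT *.
SELECT trade_id, symbol, amount
FROM trades_curated
WHERE trade_date BETWEEN '2025-01-01' AND '2025-01-31';
```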
ATS Keywords: Snowflake, Data Warehouse, Azure, Databricks, Spark, ETL, Cloud Architecture, Data Processing, Data Transformation, Performance Tuning, SQL, Agile, Problem Solving, Communication Skills, Analytical Skills
📝 Enhancement Note: The interview process for this role focuses on technical assessments, behavioral and cultural fit, and a final evaluation. Ensure you are well-versed in data warehousing, ETL processes, and cloud-based data processing using Snowflake, Azure, and Databricks. Highlight your problem-solving skills, communication, and interpersonal skills throughout the interview process.
🛠 Technology Stack & Web Infrastructure
Backend & Server Technologies:
- Snowflake: A cloud-based data warehousing platform used for storing, processing, and analyzing large datasets.
- Azure: A comprehensive cloud computing platform provided by Microsoft, offering a wide range of services for building, testing, deploying, and managing applications and services.
- Databricks: A data processing platform built on top of Apache Spark, designed for fast, efficient, and easy data processing in the cloud.
Development & DevOps Tools:
- Apache Spark: A fast and general engine for large-scale data processing.
- DBT: A command-line tool that enables data analysts and data engineers to transform data in their warehouses by simply writing select statements.
- Snowpipe: A fully managed, continuous data-ingestion service for Snowflake that loads files shortly after they arrive in a stage, enabling near-real-time data availability.
- Shell Scripting: Used to automate administrative tasks and routine data-pipeline operations.
- Autosys/Control-M/Cron: Job scheduling tools used to automate and manage tasks and workflows.
📝 Enhancement Note: This role requires a strong understanding of Snowflake, Azure, and Databricks, as well as proficiency in SQL, shell scripting, and job scheduling tools. Familiarity with big data technologies like Apache Spark and data transformation tools like DBT is a plus.
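Since DBT appears in the required tooling, a small hypothetical dbt model illustrates the "transform with SELECT statements" idea (the model, source, and column names are illustrative, not from the posting):

```sql
-- Hypothetical dbt model, e.g. models/daily_positions.sql. dbt compiles
-- ref() to the upstream relation and materializes the query result.
{{ config(materialized = 'incremental', unique_key = 'position_id') }}

SELECT
    position_id,
    account_id,
    SUM(quantity)        AS total_quantity,
    CURRENT_TIMESTAMP()  AS loaded_at
FROM {{ ref('stg_positions') }}
{% if is_incremental() %}
-- On incremental runs, only process rows newer than the last load.
WHERE updated_at > (SELECT MAX(loaded_at) FROM {{ this }})
{% endif %}
GROUP BY position_id, account_id
```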
👥 Team Culture & Values
Data Engineering Values:
- Data-Driven Decision Making: Leverage data and analytics to inform decision-making and improve overall business performance.
- Continuous Learning & Improvement: Foster a culture of continuous learning and improvement, staying up-to-date with the latest technologies and best practices in data warehousing and ETL systems.
- Collaboration & Communication: Encourage open communication, active listening, and effective collaboration within the team and across the organization.
- Innovation & Creativity: Foster a culture of innovation and creativity, encouraging team members to explore new technologies and approaches to data processing and transformation.
Collaboration Style:
- Agile Team Dynamics: Work within an agile team, following agile methodologies and best practices for project planning, execution, and delivery.
- Cross-Functional Collaboration: Collaborate with business users, designers, and stakeholders to understand requirements, gather feedback, and ensure data-driven decision-making.
- Knowledge Sharing & Mentoring: Share knowledge and expertise with team members, providing mentoring and guidance to support their professional growth and development.
📝 Enhancement Note: State Street values data-driven decision-making, continuous learning and improvement, collaboration, and innovation. The team follows agile methodologies and encourages cross-functional collaboration, knowledge sharing, and mentoring to support the professional growth and development of team members.
⚡ Challenges & Growth Opportunities
Technical Challenges:
- Data Complexity: Work with large, complex datasets, handling JSON, ORC, PARQUET, and CSV files from various sources like AWS S3.
- Performance Optimization: Optimize database queries, improve performance, and tune Snowflake objects and views to ensure efficient data processing and transformation.
- Cloud Migration: Migrate data warehouses and ETL systems to the cloud, leveraging Azure-based services and Databricks for data processing and transformation.
- Data Governance: Implement data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations and standards.
Learning & Development Opportunities:
- Technical Skill Development: Expand expertise in cloud-based data processing, big data technologies, and data governance.
- Leadership Development: Develop leadership skills by mentoring junior team members and driving technical decisions within the team.
- Architecture Decision-Making: Gain experience in designing and implementing data architecture, data pipelines, and data streaming solutions.
📝 Enhancement Note: This role presents various technical challenges related to data complexity, performance optimization, cloud migration, and data governance. The successful candidate will have the opportunity to expand their technical skills, develop leadership skills, and gain experience in architecture decision-making.
💡 Interview Preparation
Technical Questions:
- Data Warehousing & ETL: Explain data warehousing concepts, schema types (Star, Snowflake), and ETL processes. Describe your experience with Snowflake, Azure, and Databricks in developing and maintaining data warehouses and ETL systems.
- Data Processing & Transformation: Discuss your experience with data processing and transformation using Databricks, Spark, and Snowpipe. Explain how you optimize data processing and transformation for improved performance and efficiency.
- Performance Tuning & Optimization: Describe your approach to performance tuning and optimization, focusing on SQL queries, Snowflake objects and views, and data processing techniques using Databricks and Spark.
Company & Culture Questions:
- Data-Driven Decision Making: Explain how you leverage data and analytics to inform decision-making and improve overall business performance.
- Continuous Learning & Improvement: Describe your approach to continuous learning and improvement, focusing on staying up-to-date with the latest technologies and best practices in data warehousing and ETL systems.
- Collaboration & Communication: Discuss your experience with cross-functional collaboration, active listening, and effective communication within the team and across the organization.
Portfolio Presentation Strategy:
- Data Warehouse & ETL Projects: Highlight projects demonstrating data warehousing and ETL system development using Snowflake, Azure, and Databricks.
- Data Processing & Transformation: Showcase your ability to process and transform large datasets using Databricks, Spark, and Snowpipe, with a focus on performance optimization and efficiency.
- Problem-Solving Skills: Present case studies demonstrating your problem-solving skills, with a focus on data-related issues and the ability to optimize data processing and transformation.
📌 Application Steps
To apply for this Senior Databricks Cloud Developer - Senior Associate position at State Street:
- Update Your Portfolio: Highlight projects demonstrating data warehousing, ETL system development, and data processing/transformation using Snowflake, Azure, and Databricks. Include case studies showcasing problem-solving skills and the ability to optimize data processing and transformation.
- Tailor Your Resume: Emphasize your experience with data warehousing, ETL processes, and cloud-based data processing using Snowflake, Azure, and Databricks. Include relevant keywords and skills to optimize your resume for the application tracking system (ATS).
- Prepare for Technical Challenges: Brush up on data warehousing concepts, schema types, and cloud-based data processing techniques. Practice writing complex SQL queries, optimizing performance, and tuning Snowflake objects and views. Familiarize yourself with Azure-based application development and data processing using Databricks and Spark.
- Research the Company: Understand State Street's focus on data-driven decision-making, continuous learning and improvement, collaboration, and innovation. Familiarize yourself with the company's products, services, and industry presence to demonstrate your cultural fit and enthusiasm for the role.
⚠️ Important Notice: This enhanced job description includes AI-generated insights and web development/server administration industry-standard assumptions. All details should be verified directly with the hiring organization before making application decisions.
Application Requirements
Candidates must have a minimum of 8 years of relevant experience in Data Warehousing and Database application development, with hands-on experience in Snowflake and Azure. Strong communication skills and the ability to work both independently and within a team are essential.