
Senior Big Data System Engineer
Deloitte     Gilbert, AZ 85295

Are you an experienced, passionate pioneer in technology? An industry solutions professional who wants to work in a collaborative environment? As an experienced Senior Systems Engineer - Big Data Platforms, you will have the ability to share new ideas and collaborate on projects as a consultant without the extensive demands of travel. If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. The Project Delivery Model (PDM) is a talent model tailored specifically for long-term, onsite client service delivery. PDM practitioners are local to project locations, minimizing extensive travel and providing a full career path within the firm.

Work you'll do/Responsibilities

This is an opportunity to join a fast-paced team that plays a key role in the overall success of our client's organization through technology enablement. You'll play a critical part in driving our client's technology vision forward and ensuring execution across multiple initiatives while working directly with our client's employees as well as Deloitte counterparts.

+ Design, build, and implement complex, large-scale, high-volume data platforms for downstream analytics and data science use.

+ Collaborate with fellow engineers, engineering program managers, data scientists, and business teams to understand the business landscape to effectively design technical requirements and solutions.

+ Build platform components ranging from core storage, processing capabilities, and critical pipelines to exploratory analysis tools, model development and training infrastructure, and online inference architectures that react to real-time signals while preserving the privacy of our customers.

+ Architect innovative solutions while playing a hands-on role to deliver products in a rapid, dynamic environment.

+ Develop tools to monitor system health, performance, and reliability.

The Team

As a part of the US Strategy & Analytics Offering Portfolio, the AI & Data Operations offering provides managed AI, Intelligent Automation, and Data DevOps services across the advise-implement-operate spectrum.

Qualifications

+ Proven, deep understanding of Hadoop internals.

+ Extensive experience with Big Data, Spark, stream processing, and various database technologies (RDBMS, NoSQL, etc.).

+ 5+ years of software engineering experience, preferably with Python.

+ Experience with building scalable solutions utilizing Cloud and container-based technologies.

+ Comfortable with automation and DevOps methodologies.

+ Ability to communicate and collaborate effectively within the team and with our stakeholders.

+ Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience.

+ Limited immigration sponsorship may be available.

+ Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve.


Preferred

+ A self-starter with a get-things-done attitude and a problem-solver mindset.

+ Data analysis and data engineering skills are a plus.

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.


Job Details

Employment Type

Full Time