Principal Big Data Engineer (São Paulo)
This role will be based in our QuantumBlack hub in São Paulo, and you will work as part of McKinsey & Company, joining a highly collaborative team of exceptionally talented Data Scientists, Data Architects, and Engineers.
QuantumBlack helps companies use data to drive decisions. We combine business experience, expertise in large-scale data analysis and visualisation, and advanced software engineering know-how to deliver results. From aerospace to finance to Formula One, we help companies prototype, develop, and deploy bespoke data science and data visualisation solutions to make better decisions.
As a Principal Big Data Engineer, you are passionate about data and the opportunities it offers organisations. You understand Big Data and Cloud computing for advanced data processing and analytics, and are excited by these technologies. You are equally comfortable talking to senior client stakeholders to understand their data and designing the ingestion processes that store it locally and prepare it for analytics. You have experience leading client projects and handling vast amounts of data: database design and development, data integration and ingestion, and designing ETL architectures using a variety of ETL tools and techniques. You are driven to implement the best possible solutions for clients and to work closely with a highly skilled Data Science team.
What you'll do
- Lead on projects from a data engineering perspective, working with our clients to model their data landscape, obtain data extracts and define secure data exchange approaches
- Plan and execute secure, good practice data integration strategies and approaches
- Acquire, ingest, and process data from multiple sources and systems into Big Data platforms
- Create and manage data environments in the Cloud
- Collaborate with our data scientists to map data fields to hypotheses and curate, wrangle, and prepare data for use in their advanced analytical models
- Apply a strong understanding of Information Security principles to ensure compliant handling and management of client data
This is a fantastic opportunity to be involved in end-to-end data management for cutting-edge Advanced Analytics and Data Science.

What you'll bring
- Commercial experience leading client-facing projects, including working in close-knit teams
- Experience and interest in Big Data technologies (Hadoop / Spark / NoSQL DBs)
- Experience working on cloud projects, ideally on AWS or Azure
- A proven ability to communicate complex solutions clearly
- Experience working on fast-paced projects in a consulting setting, often on multiple projects at the same time
- A strong development background, with experience in at least two scripting, object-oriented, or functional programming languages, e.g. SQL, Python, Java, Scala, C#, R
- Data Warehousing experience, building operational ETL data pipelines across a number of sources, and constructing relational and dimensional data models
- Experience in at least one ETL tool (e.g. Informatica, Talend, Pentaho, DataStage)
- The ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets
- Excellent interpersonal skills, interacting with clients in a clear, timely, and professional manner
- A deep personal motivation to always produce outstanding work for your clients and colleagues
- A collaborative approach, working well with colleagues from diverse skill sets and backgrounds